id stringlengths 2 115 | lastModified stringlengths 24 24 | tags list | author stringlengths 2 42 ⌀ | description stringlengths 0 68.7k ⌀ | citation stringlengths 0 10.7k ⌀ | cardData null | likes int64 0 3.55k | downloads int64 0 10.1M | card stringlengths 0 1.01M |
|---|---|---|---|---|---|---|---|---|---|
yzhuang/autotree_automl_california_sgosdt_l256_d3_sd0 | 2023-08-30T14:39:27.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 185200000
num_examples: 10000
- name: validation
num_bytes: 185200000
num_examples: 10000
download_size: 149978405
dataset_size: 370400000
---
# Dataset Card for "autotree_automl_california_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/1000max464 | 2023-08-29T13:21:35.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|
NobodyExistsOnTheInternet/1500max463 | 2023-08-29T13:22:51.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|
rajamamoon/potential | 2023-08-29T13:39:33.000Z | [
"region:us"
] | rajamamoon | null | null | null | 0 | 0 | Entry not found |
jiiiM/poh | 2023-08-29T14:26:54.000Z | [
"license:other",
"region:us"
] | jiiiM | null | null | null | 0 | 0 | ---
license: other
---
|
NobodyExistsOnTheInternet/1000sub463 | 2023-08-29T13:24:25.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|
niting3c/Malicious_packets_subset | 2023-08-29T13:29:21.000Z | [
"license:mit",
"region:us"
] | niting3c | null | null | null | 0 | 0 | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': attack
'1': normal
splits:
- name: train
num_bytes: 17251473.27644231
num_examples: 10000
- name: test
num_bytes: 1725147.3276442313
num_examples: 1000
download_size: 10373120
dataset_size: 18976620.604086544
---
|
loubnabnl/wizardcoder-python-34b-generations | 2023-08-29T13:46:38.000Z | [
"region:us"
] | loubnabnl | null | null | null | 0 | 0 | Entry not found |
hgjc23/data | 2023-09-02T23:23:13.000Z | [
"region:us"
] | hgjc23 | null | null | null | 0 | 0 | Entry not found |
marasama/nva-minerva_the_exalted_lightsworn | 2023-08-29T14:01:51.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
Roscall/Elvis70s | 2023-08-29T14:04:13.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | Entry not found |
marckohlbrugge/test123 | 2023-08-29T14:08:08.000Z | [
"region:us"
] | marckohlbrugge | null | null | null | 0 | 0 | Entry not found |
NobodyExistsOnTheInternet/Chem2800ctx | 2023-08-29T14:32:35.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|
nikchar/retrieved_claims_val | 2023-08-31T14:58:06.000Z | [
"region:us"
] | nikchar | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: label
dtype: string
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: retrieved_evidence
sequence: string
- name: retrieval_score
sequence: float64
- name: id
dtype: string
- name: text
dtype: string
- name: lines
dtype: string
splits:
- name: train
num_bytes: 6038497
num_examples: 1500
download_size: 2990657
dataset_size: 6038497
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "retrieved_claims_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CineAI/Bald-ds | 2023-08-29T14:41:31.000Z | [
"region:us"
] | CineAI | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TheBloke__Llama-2-13B-GPTQ | 2023-08-31T11:14:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Llama-2-13B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Llama-2-13B-GPTQ](https://huggingface.co/TheBloke/Llama-2-13B-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Llama-2-13B-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T11:12:42.998068](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-13B-GPTQ/blob/main/results_2023-08-31T11%3A12%3A42.998068.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5455217135882148,\n\
\ \"acc_stderr\": 0.03452354370556732,\n \"acc_norm\": 0.5498379148078576,\n\
\ \"acc_norm_stderr\": 0.03450345490667003,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.37071042385732017,\n\
\ \"mc2_stderr\": 0.01376534132094419\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5443686006825939,\n \"acc_stderr\": 0.014553749939306861,\n\
\ \"acc_norm\": 0.591296928327645,\n \"acc_norm_stderr\": 0.014365750345427\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6070503883688508,\n\
\ \"acc_stderr\": 0.00487407625052158,\n \"acc_norm\": 0.8147779326827326,\n\
\ \"acc_norm_stderr\": 0.003876836709461133\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.030402331445769544,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.030402331445769544\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101806,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101806\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\
\ \"acc_stderr\": 0.02704574657353433,\n \"acc_norm\": 0.6548387096774193,\n\
\ \"acc_norm_stderr\": 0.02704574657353433\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.037563357751878974,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.037563357751878974\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03358618145732522,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03358618145732522\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683522,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683522\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.032284106267163895,\n\
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.032284106267163895\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7357798165137615,\n \"acc_stderr\": 0.018904164171510193,\n \"\
acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.018904164171510193\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475544,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395593,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395593\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935575,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935575\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697625,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697625\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922726,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922726\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7343550446998723,\n\
\ \"acc_stderr\": 0.01579430248788873,\n \"acc_norm\": 0.7343550446998723,\n\
\ \"acc_norm_stderr\": 0.01579430248788873\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306393,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306393\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31731843575418994,\n\
\ \"acc_stderr\": 0.01556639263005703,\n \"acc_norm\": 0.31731843575418994,\n\
\ \"acc_norm_stderr\": 0.01556639263005703\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722327,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722327\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.0291898056735871,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.0291898056735871\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4178617992177314,\n\
\ \"acc_stderr\": 0.01259674410899856,\n \"acc_norm\": 0.4178617992177314,\n\
\ \"acc_norm_stderr\": 0.01259674410899856\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5571895424836601,\n \"acc_stderr\": 0.02009508315457734,\n \
\ \"acc_norm\": 0.5571895424836601,\n \"acc_norm_stderr\": 0.02009508315457734\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.03071356045510849,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.03071356045510849\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.033014059469872487,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.033014059469872487\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.37071042385732017,\n\
\ \"mc2_stderr\": 0.01376534132094419\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Llama-2-13B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|arc:challenge|25_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|arc:challenge|25_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|arc:challenge|25_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hellaswag|10_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hellaswag|10_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hellaswag|10_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T11:12:42.998068.parquet'
- config_name: results
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- results_2023-08-29T15:04:20.709230.parquet
- split: 2023_08_30T10_42_39.395336
path:
- results_2023-08-30T10:42:39.395336.parquet
- split: 2023_08_31T11_12_42.998068
path:
- results_2023-08-31T11:12:42.998068.parquet
- split: latest
path:
- results_2023-08-31T11:12:42.998068.parquet
---
# Dataset Card for Evaluation run of TheBloke/Llama-2-13B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Llama-2-13B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Llama-2-13B-GPTQ](https://huggingface.co/TheBloke/Llama-2-13B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-13B-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-31T11:12:42.998068](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-13B-GPTQ/blob/main/results_2023-08-31T11%3A12%3A42.998068.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5455217135882148,
"acc_stderr": 0.03452354370556732,
"acc_norm": 0.5498379148078576,
"acc_norm_stderr": 0.03450345490667003,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.37071042385732017,
"mc2_stderr": 0.01376534132094419
},
"harness|arc:challenge|25": {
"acc": 0.5443686006825939,
"acc_stderr": 0.014553749939306861,
"acc_norm": 0.591296928327645,
"acc_norm_stderr": 0.014365750345427
},
"harness|hellaswag|10": {
"acc": 0.6070503883688508,
"acc_stderr": 0.00487407625052158,
"acc_norm": 0.8147779326827326,
"acc_norm_stderr": 0.003876836709461133
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.030402331445769544,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.030402331445769544
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101806,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101806
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.02704574657353433,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.02704574657353433
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.037563357751878974,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.037563357751878974
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03358618145732522,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03358618145732522
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4897435897435897,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.4897435897435897,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683522,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683522
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.032284106267163895,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.032284106267163895
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.018904164171510193,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.018904164171510193
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395593,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395593
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935575,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935575
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697625,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697625
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922726,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922726
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7343550446998723,
"acc_stderr": 0.01579430248788873,
"acc_norm": 0.7343550446998723,
"acc_norm_stderr": 0.01579430248788873
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306393,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306393
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31731843575418994,
"acc_stderr": 0.01556639263005703,
"acc_norm": 0.31731843575418994,
"acc_norm_stderr": 0.01556639263005703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776162,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776162
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722327,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4178617992177314,
"acc_stderr": 0.01259674410899856,
"acc_norm": 0.4178617992177314,
"acc_norm_stderr": 0.01259674410899856
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5571895424836601,
"acc_stderr": 0.02009508315457734,
"acc_norm": 0.5571895424836601,
"acc_norm_stderr": 0.02009508315457734
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.03071356045510849,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.03071356045510849
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.033014059469872487,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.033014059469872487
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.37071042385732017,
"mc2_stderr": 0.01376534132094419
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hbt2023/bug | 2023-08-29T15:07:42.000Z | [
"region:us"
] | hbt2023 | null | null | null | 0 | 0 | Entry not found |
ihixehima/ihixehim | 2023-08-29T15:41:31.000Z | [
"region:us"
] | ihixehima | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B | 2023-08-29T15:17:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xzuyn/LLaMa-2-PeanutButter_v4-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xzuyn/LLaMa-2-PeanutButter_v4-7B](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T15:15:59.631802](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B/blob/main/results_2023-08-29T15%3A15%3A59.631802.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.4754535953456773,\n \"\
acc_stderr\": 0.03543074449128995,\n \"acc_norm\": 0.4793512530654778,\n\
\ \"acc_norm_stderr\": 0.03541409593269912,\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834557,\n \"mc2\": 0.42310904021377665,\n\
\ \"mc2_stderr\": 0.015624011969941223\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.507679180887372,\n \"acc_stderr\": 0.014609667440892567,\n\
\ \"acc_norm\": 0.5486348122866894,\n \"acc_norm_stderr\": 0.014542104569955265\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6188010356502689,\n\
\ \"acc_stderr\": 0.004846886929763466,\n \"acc_norm\": 0.8078072097191794,\n\
\ \"acc_norm_stderr\": 0.003932184843841659\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.040260970832965585,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.040260970832965585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500482,\n\
\ \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500482\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.041443118108781506,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.041443118108781506\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101796,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101796\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5032258064516129,\n\
\ \"acc_stderr\": 0.028443414226438316,\n \"acc_norm\": 0.5032258064516129,\n\
\ \"acc_norm_stderr\": 0.028443414226438316\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998573,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998573\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.032396370467357036,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.032396370467357036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.025294608023986476,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.025294608023986476\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945287,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945287\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4579831932773109,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.4579831932773109,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6385321100917432,\n \"acc_stderr\": 0.020598082009937374,\n \"\
acc_norm\": 0.6385321100917432,\n \"acc_norm_stderr\": 0.020598082009937374\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.03476099060501636,\n\
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.03476099060501636\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5949367088607594,\n \"acc_stderr\": 0.03195514741370671,\n \
\ \"acc_norm\": 0.5949367088607594,\n \"acc_norm_stderr\": 0.03195514741370671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n\
\ \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.5381165919282511,\n\
\ \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5619834710743802,\n \"acc_stderr\": 0.04529146804435792,\n \"\
acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.04529146804435792\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n\
\ \"acc_stderr\": 0.030351527323344948,\n \"acc_norm\": 0.688034188034188,\n\
\ \"acc_norm_stderr\": 0.030351527323344948\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6155810983397191,\n\
\ \"acc_stderr\": 0.01739568874281962,\n \"acc_norm\": 0.6155810983397191,\n\
\ \"acc_norm_stderr\": 0.01739568874281962\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.476878612716763,\n \"acc_stderr\": 0.026890297881303128,\n\
\ \"acc_norm\": 0.476878612716763,\n \"acc_norm_stderr\": 0.026890297881303128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n\
\ \"acc_stderr\": 0.015318257745976708,\n \"acc_norm\": 0.2994413407821229,\n\
\ \"acc_norm_stderr\": 0.015318257745976708\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.028590752958852387,\n\
\ \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.028590752958852387\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\
\ \"acc_stderr\": 0.02804339985821063,\n \"acc_norm\": 0.5787781350482315,\n\
\ \"acc_norm_stderr\": 0.02804339985821063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n\
\ \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125146,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125146\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36766623207301175,\n\
\ \"acc_stderr\": 0.012314845910071691,\n \"acc_norm\": 0.36766623207301175,\n\
\ \"acc_norm_stderr\": 0.012314845910071691\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.030290619180485694,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.030290619180485694\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.434640522875817,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4897959183673469,\n \"acc_stderr\": 0.03200255347893783,\n\
\ \"acc_norm\": 0.4897959183673469,\n \"acc_norm_stderr\": 0.03200255347893783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n\
\ \"acc_stderr\": 0.0343751933733825,\n \"acc_norm\": 0.6169154228855721,\n\
\ \"acc_norm_stderr\": 0.0343751933733825\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457923,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457923\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834557,\n \"mc2\": 0.42310904021377665,\n\
\ \"mc2_stderr\": 0.015624011969941223\n }\n}\n```"
repo_url: https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|arc:challenge|25_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hellaswag|10_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:15:59.631802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:15:59.631802.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T15:15:59.631802.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T15:15:59.631802.parquet'
- config_name: results
data_files:
- split: 2023_08_29T15_15_59.631802
path:
- results_2023-08-29T15:15:59.631802.parquet
- split: latest
path:
- results_2023-08-29T15:15:59.631802.parquet
---
# Dataset Card for Evaluation run of xzuyn/LLaMa-2-PeanutButter_v4-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xzuyn/LLaMa-2-PeanutButter_v4-7B](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T15:15:59.631802](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B/blob/main/results_2023-08-29T15%3A15%3A59.631802.json):
```python
{
"all": {
"acc": 0.4754535953456773,
"acc_stderr": 0.03543074449128995,
"acc_norm": 0.4793512530654778,
"acc_norm_stderr": 0.03541409593269912,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834557,
"mc2": 0.42310904021377665,
"mc2_stderr": 0.015624011969941223
},
"harness|arc:challenge|25": {
"acc": 0.507679180887372,
"acc_stderr": 0.014609667440892567,
"acc_norm": 0.5486348122866894,
"acc_norm_stderr": 0.014542104569955265
},
"harness|hellaswag|10": {
"acc": 0.6188010356502689,
"acc_stderr": 0.004846886929763466,
"acc_norm": 0.8078072097191794,
"acc_norm_stderr": 0.003932184843841659
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.040260970832965585,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.040260970832965585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500482,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500482
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.041443118108781506,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.041443118108781506
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101796,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101796
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5032258064516129,
"acc_stderr": 0.028443414226438316,
"acc_norm": 0.5032258064516129,
"acc_norm_stderr": 0.028443414226438316
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998573,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998573
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.032396370467357036,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.032396370467357036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.025294608023986476,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.025294608023986476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945287,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945287
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4579831932773109,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.4579831932773109,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6385321100917432,
"acc_stderr": 0.020598082009937374,
"acc_norm": 0.6385321100917432,
"acc_norm_stderr": 0.020598082009937374
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.03476099060501636,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.03476099060501636
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5949367088607594,
"acc_stderr": 0.03195514741370671,
"acc_norm": 0.5949367088607594,
"acc_norm_stderr": 0.03195514741370671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5619834710743802,
"acc_stderr": 0.04529146804435792,
"acc_norm": 0.5619834710743802,
"acc_norm_stderr": 0.04529146804435792
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04826217294139894,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04826217294139894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.030351527323344948,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.030351527323344948
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6155810983397191,
"acc_stderr": 0.01739568874281962,
"acc_norm": 0.6155810983397191,
"acc_norm_stderr": 0.01739568874281962
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.476878612716763,
"acc_stderr": 0.026890297881303128,
"acc_norm": 0.476878612716763,
"acc_norm_stderr": 0.026890297881303128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2994413407821229,
"acc_stderr": 0.015318257745976708,
"acc_norm": 0.2994413407821229,
"acc_norm_stderr": 0.015318257745976708
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.028590752958852387,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.028590752958852387
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.02804339985821063,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.02804339985821063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.02883892147125146,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.02883892147125146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36766623207301175,
"acc_stderr": 0.012314845910071691,
"acc_norm": 0.36766623207301175,
"acc_norm_stderr": 0.012314845910071691
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.030290619180485694,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.030290619180485694
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4897959183673469,
"acc_stderr": 0.03200255347893783,
"acc_norm": 0.4897959183673469,
"acc_norm_stderr": 0.03200255347893783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6169154228855721,
"acc_stderr": 0.0343751933733825,
"acc_norm": 0.6169154228855721,
"acc_norm_stderr": 0.0343751933733825
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457923,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457923
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834557,
"mc2": 0.42310904021377665,
"mc2_stderr": 0.015624011969941223
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Linhz/qg_viquad_20samples | 2023-08-29T15:57:08.000Z | [
"region:us"
] | Linhz | null | null | null | 0 | 0 | Entry not found |
fadingNA/f1-seneca | 2023-08-30T22:46:29.000Z | [
"license:openrail",
"region:us"
] | fadingNA | null | null | null | 0 | 0 | ---
license: openrail
---
|
tyzhu/fwv2_random_rare_train_10_eval_10 | 2023-08-29T16:07:18.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3025
num_examples: 30
- name: train_doc2id
num_bytes: 1821
num_examples: 20
- name: train_id2doc
num_bytes: 1881
num_examples: 20
- name: train_find_word
num_bytes: 1144
num_examples: 10
- name: eval_find_word
num_bytes: 1136
num_examples: 10
- name: id_context_mapping
num_bytes: 1241
num_examples: 20
download_size: 0
dataset_size: 10248
---
# Dataset Card for "fwv2_random_rare_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abdulrub/fashion-products | 2023-08-29T17:52:33.000Z | [
"license:apache-2.0",
"region:us"
] | abdulrub | null | null | null | 0 | 0 | ---
license: apache-2.0
dataset_info:
features:
- name: id
struct:
- name: Brand
dtype: string
- name: Colour
dtype: string
- name: Description
dtype: string
- name: Price in Rupees
dtype: float64
- name: Product Name
dtype: string
- name: Rating
dtype: float64
- name: Rating Count
dtype: float64
- name: id
dtype: float64
- name: image
dtype: string
splits:
- name: train
num_bytes: 9243
num_examples: 15
download_size: 11175
dataset_size: 9243
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mrm8488/FloCo_train | 2023-08-29T15:37:44.000Z | [
"region:us"
] | mrm8488 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: common_id
dtype: string
- name: image
dtype: string
- name: code
dtype: string
splits:
- name: train
num_bytes: 1530119
num_examples: 10102
download_size: 843087
dataset_size: 1530119
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "FloCo_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mrm8488/FloCo_val | 2023-08-29T15:37:54.000Z | [
"region:us"
] | mrm8488 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: common_id
dtype: string
- name: image
dtype: string
- name: code
dtype: string
splits:
- name: train
num_bytes: 156818
num_examples: 594
download_size: 67865
dataset_size: 156818
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "FloCo_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mrm8488/FloCo_test | 2023-08-29T15:38:02.000Z | [
"region:us"
] | mrm8488 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: common_id
dtype: string
- name: image
dtype: string
- name: code
dtype: string
splits:
- name: train
num_bytes: 268542
num_examples: 1188
download_size: 133517
dataset_size: 268542
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "FloCo_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1 | 2023-08-29T15:43:19.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of DanielSc4/RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DanielSc4/RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1](https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T15:41:04.762236](https://huggingface.co/datasets/open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1/blob/main/results_2023-08-29T15%3A41%3A04.762236.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.26680324768439967,\n \"\
acc_stderr\": 0.03192521196777,\n \"acc_norm\": 0.27043158188605154,\n \
\ \"acc_norm_stderr\": 0.031924647823271375,\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123895,\n \"mc2\": 0.3504462244325371,\n\
\ \"mc2_stderr\": 0.01326389716056673\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3728668941979522,\n \"acc_stderr\": 0.01413117676013116,\n\
\ \"acc_norm\": 0.4129692832764505,\n \"acc_norm_stderr\": 0.014388344935398324\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49422425811591314,\n\
\ \"acc_stderr\": 0.004989448490164431,\n \"acc_norm\": 0.6681935869348735,\n\
\ \"acc_norm_stderr\": 0.004698995789478812\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566016,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206824,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206824\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.033450369167889925,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.033450369167889925\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793254,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793254\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2170212765957447,\n \"acc_stderr\": 0.026947483121496228,\n\
\ \"acc_norm\": 0.2170212765957447,\n \"acc_norm_stderr\": 0.026947483121496228\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2275132275132275,\n \"acc_stderr\": 0.021591269407823778,\n \"\
acc_norm\": 0.2275132275132275,\n \"acc_norm_stderr\": 0.021591269407823778\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.038095238095238106,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.038095238095238106\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2315270935960591,\n \"acc_stderr\": 0.029678333141444455,\n\
\ \"acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.029678333141444455\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.0340150671524904,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.0340150671524904\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.36363636363636365,\n \"acc_stderr\": 0.034273086529999344,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.034273086529999344\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2849740932642487,\n \"acc_stderr\": 0.032577140777096614,\n\
\ \"acc_norm\": 0.2849740932642487,\n \"acc_norm_stderr\": 0.032577140777096614\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.02248938979365481,\n \
\ \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.02248938979365481\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380572,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380572\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3321100917431193,\n\
\ \"acc_stderr\": 0.020192682985423344,\n \"acc_norm\": 0.3321100917431193,\n\
\ \"acc_norm_stderr\": 0.020192682985423344\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510923,\n\
\ \"acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510923\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.31862745098039214,\n \"acc_stderr\": 0.032702871814820796,\n \"\
acc_norm\": 0.31862745098039214,\n \"acc_norm_stderr\": 0.032702871814820796\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.15246636771300448,\n\
\ \"acc_stderr\": 0.024126204813252877,\n \"acc_norm\": 0.15246636771300448,\n\
\ \"acc_norm_stderr\": 0.024126204813252877\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4214876033057851,\n \"acc_stderr\": 0.045077322787750944,\n \"\
acc_norm\": 0.4214876033057851,\n \"acc_norm_stderr\": 0.045077322787750944\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.038935425188248496,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.038935425188248496\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.17177914110429449,\n \"acc_stderr\": 0.029634717272371047,\n\
\ \"acc_norm\": 0.17177914110429449,\n \"acc_norm_stderr\": 0.029634717272371047\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.03952301967702511,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.03952301967702511\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326469,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326469\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.0281209665039144,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.0281209665039144\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24904214559386972,\n\
\ \"acc_stderr\": 0.01546467616339596,\n \"acc_norm\": 0.24904214559386972,\n\
\ \"acc_norm_stderr\": 0.01546467616339596\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n\
\ \"acc_stderr\": 0.014655780837497698,\n \"acc_norm\": 0.25921787709497207,\n\
\ \"acc_norm_stderr\": 0.014655780837497698\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.02508947852376513,\n\
\ \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.02508947852376513\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.02635806569888059,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.02635806569888059\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26140808344198174,\n\
\ \"acc_stderr\": 0.011222528169771312,\n \"acc_norm\": 0.26140808344198174,\n\
\ \"acc_norm_stderr\": 0.011222528169771312\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22794117647058823,\n \"acc_stderr\": 0.025483081468029804,\n\
\ \"acc_norm\": 0.22794117647058823,\n \"acc_norm_stderr\": 0.025483081468029804\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378984,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378984\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505415,\n \"acc_norm\": 0.35454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505415\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.028920583220675575,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.028920583220675575\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401467,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401467\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123895,\n \"mc2\": 0.3504462244325371,\n\
\ \"mc2_stderr\": 0.01326389716056673\n }\n}\n```"
repo_url: https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|arc:challenge|25_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hellaswag|10_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:41:04.762236.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:41:04.762236.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T15:41:04.762236.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T15:41:04.762236.parquet'
- config_name: results
data_files:
- split: 2023_08_29T15_41_04.762236
path:
- results_2023-08-29T15:41:04.762236.parquet
- split: latest
path:
- results_2023-08-29T15:41:04.762236.parquet
---
# Dataset Card for Evaluation run of DanielSc4/RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DanielSc4/RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1](https://huggingface.co/DanielSc4/RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T15:41:04.762236](https://huggingface.co/datasets/open-llm-leaderboard/details_DanielSc4__RedPajama-INCITE-Chat-3B-v1-RL-LoRA-8bit-test1/blob/main/results_2023-08-29T15%3A41%3A04.762236.json):
```python
{
"all": {
"acc": 0.26680324768439967,
"acc_stderr": 0.03192521196777,
"acc_norm": 0.27043158188605154,
"acc_norm_stderr": 0.031924647823271375,
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123895,
"mc2": 0.3504462244325371,
"mc2_stderr": 0.01326389716056673
},
"harness|arc:challenge|25": {
"acc": 0.3728668941979522,
"acc_stderr": 0.01413117676013116,
"acc_norm": 0.4129692832764505,
"acc_norm_stderr": 0.014388344935398324
},
"harness|hellaswag|10": {
"acc": 0.49422425811591314,
"acc_stderr": 0.004989448490164431,
"acc_norm": 0.6681935869348735,
"acc_norm_stderr": 0.004698995789478812
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566016,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206824,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206824
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889925,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889925
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793254,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793254
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.026947483121496228,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.026947483121496228
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2275132275132275,
"acc_stderr": 0.021591269407823778,
"acc_norm": 0.2275132275132275,
"acc_norm_stderr": 0.021591269407823778
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238106,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238106
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.029678333141444455,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.029678333141444455
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.0340150671524904,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.0340150671524904
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.034273086529999344,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.034273086529999344
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2849740932642487,
"acc_stderr": 0.032577140777096614,
"acc_norm": 0.2849740932642487,
"acc_norm_stderr": 0.032577140777096614
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.02248938979365481,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.02248938979365481
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380572,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380572
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3321100917431193,
"acc_stderr": 0.020192682985423344,
"acc_norm": 0.3321100917431193,
"acc_norm_stderr": 0.020192682985423344
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510923,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510923
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.31862745098039214,
"acc_stderr": 0.032702871814820796,
"acc_norm": 0.31862745098039214,
"acc_norm_stderr": 0.032702871814820796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.15246636771300448,
"acc_stderr": 0.024126204813252877,
"acc_norm": 0.15246636771300448,
"acc_norm_stderr": 0.024126204813252877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4214876033057851,
"acc_stderr": 0.045077322787750944,
"acc_norm": 0.4214876033057851,
"acc_norm_stderr": 0.045077322787750944
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.038935425188248496,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.038935425188248496
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.17177914110429449,
"acc_stderr": 0.029634717272371047,
"acc_norm": 0.17177914110429449,
"acc_norm_stderr": 0.029634717272371047
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.03952301967702511,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.03952301967702511
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.0281209665039144,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.0281209665039144
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24904214559386972,
"acc_stderr": 0.01546467616339596,
"acc_norm": 0.24904214559386972,
"acc_norm_stderr": 0.01546467616339596
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25921787709497207,
"acc_stderr": 0.014655780837497698,
"acc_norm": 0.25921787709497207,
"acc_norm_stderr": 0.014655780837497698
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.02508947852376513,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.02508947852376513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.02635806569888059,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.02635806569888059
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26140808344198174,
"acc_stderr": 0.011222528169771312,
"acc_norm": 0.26140808344198174,
"acc_norm_stderr": 0.011222528169771312
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22794117647058823,
"acc_stderr": 0.025483081468029804,
"acc_norm": 0.22794117647058823,
"acc_norm_stderr": 0.025483081468029804
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612378984,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612378984
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.04582004841505415,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.04582004841505415
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.028920583220675575,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.028920583220675575
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401467,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401467
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123895,
"mc2": 0.3504462244325371,
"mc2_stderr": 0.01326389716056673
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tyzhu/fwv2_random_rare_train_100_eval_100 | 2023-08-29T16:09:39.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 30323
num_examples: 300
- name: train_doc2id
num_bytes: 18225
num_examples: 200
- name: train_id2doc
num_bytes: 18825
num_examples: 200
- name: train_find_word
num_bytes: 11498
num_examples: 100
- name: eval_find_word
num_bytes: 11344
num_examples: 100
- name: id_context_mapping
num_bytes: 12425
num_examples: 200
download_size: 0
dataset_size: 102640
---
# Dataset Card for "fwv2_random_rare_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_random_rare_train_1000_eval_100 | 2023-08-29T16:12:04.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 218225
num_examples: 2100
- name: train_doc2id
num_bytes: 100243
num_examples: 1100
- name: train_id2doc
num_bytes: 103543
num_examples: 1100
- name: train_find_word
num_bytes: 114682
num_examples: 1000
- name: eval_find_word
num_bytes: 11342
num_examples: 100
- name: id_context_mapping
num_bytes: 68343
num_examples: 1100
download_size: 0
dataset_size: 616378
---
# Dataset Card for "fwv2_random_rare_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_squad_rare_train_10_eval_10 | 2023-08-29T16:14:49.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 4812
num_examples: 30
- name: train_doc2id
num_bytes: 3534
num_examples: 20
- name: train_id2doc
num_bytes: 3594
num_examples: 20
- name: train_find_word
num_bytes: 1218
num_examples: 10
- name: eval_find_word
num_bytes: 1174
num_examples: 10
- name: id_context_mapping
num_bytes: 2954
num_examples: 20
download_size: 25115
dataset_size: 17286
---
# Dataset Card for "fwv2_squad_rare_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_IkariDev__Athena-tmp | 2023-08-29T15:52:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of IkariDev/Athena-tmp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [IkariDev/Athena-tmp](https://huggingface.co/IkariDev/Athena-tmp) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_IkariDev__Athena-tmp\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T15:50:42.106753](https://huggingface.co/datasets/open-llm-leaderboard/details_IkariDev__Athena-tmp/blob/main/results_2023-08-29T15%3A50%3A42.106753.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5888874553745688,\n \"\
acc_stderr\": 0.03407664559390293,\n \"acc_norm\": 0.5926858740874733,\n\
\ \"acc_norm_stderr\": 0.034057449595187576,\n \"mc1\": 0.38922888616891066,\n\
\ \"mc1_stderr\": 0.017068552680690328,\n \"mc2\": 0.5536706803409501,\n\
\ \"mc2_stderr\": 0.01611557269809252\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5674061433447098,\n \"acc_stderr\": 0.014478005694182531,\n\
\ \"acc_norm\": 0.5921501706484642,\n \"acc_norm_stderr\": 0.014361097288449696\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6218880701055567,\n\
\ \"acc_stderr\": 0.004839247332606038,\n \"acc_norm\": 0.8212507468631747,\n\
\ \"acc_norm_stderr\": 0.003823591814133031\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849726,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849726\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.030052580579557845,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.030052580579557845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03800968060554859,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.03800968060554859\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.024594975128920938,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.024594975128920938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6451612903225806,\n \"acc_stderr\": 0.02721888977330876,\n \"\
acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.02721888977330876\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.458128078817734,\n \"acc_stderr\": 0.03505630140785742,\n \"acc_norm\"\
: 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785742\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n\
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630804,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.041032038305145124,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.041032038305145124\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.02514093595033545,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.02514093595033545\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333567,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977247,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977247\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44692737430167595,\n\
\ \"acc_stderr\": 0.01662803003964761,\n \"acc_norm\": 0.44692737430167595,\n\
\ \"acc_norm_stderr\": 0.01662803003964761\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602656,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602656\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621344,\n\
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621344\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532063,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532063\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105935,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105935\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38922888616891066,\n\
\ \"mc1_stderr\": 0.017068552680690328,\n \"mc2\": 0.5536706803409501,\n\
\ \"mc2_stderr\": 0.01611557269809252\n }\n}\n```"
repo_url: https://huggingface.co/IkariDev/Athena-tmp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|arc:challenge|25_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hellaswag|10_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:50:42.106753.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:50:42.106753.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T15:50:42.106753.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T15:50:42.106753.parquet'
- config_name: results
data_files:
- split: 2023_08_29T15_50_42.106753
path:
- results_2023-08-29T15:50:42.106753.parquet
- split: latest
path:
- results_2023-08-29T15:50:42.106753.parquet
---
# Dataset Card for Evaluation run of IkariDev/Athena-tmp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/IkariDev/Athena-tmp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [IkariDev/Athena-tmp](https://huggingface.co/IkariDev/Athena-tmp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_IkariDev__Athena-tmp",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T15:50:42.106753](https://huggingface.co/datasets/open-llm-leaderboard/details_IkariDev__Athena-tmp/blob/main/results_2023-08-29T15%3A50%3A42.106753.json):
```python
{
"all": {
"acc": 0.5888874553745688,
"acc_stderr": 0.03407664559390293,
"acc_norm": 0.5926858740874733,
"acc_norm_stderr": 0.034057449595187576,
"mc1": 0.38922888616891066,
"mc1_stderr": 0.017068552680690328,
"mc2": 0.5536706803409501,
"mc2_stderr": 0.01611557269809252
},
"harness|arc:challenge|25": {
"acc": 0.5674061433447098,
"acc_stderr": 0.014478005694182531,
"acc_norm": 0.5921501706484642,
"acc_norm_stderr": 0.014361097288449696
},
"harness|hellaswag|10": {
"acc": 0.6218880701055567,
"acc_stderr": 0.004839247332606038,
"acc_norm": 0.8212507468631747,
"acc_norm_stderr": 0.003823591814133031
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849726,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849726
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.030052580579557845,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.030052580579557845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554859,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554859
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.024594975128920938,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.024594975128920938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330876,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330876
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785742,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785742
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723875,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723875
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630804,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.041032038305145124,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.041032038305145124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033545,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033545
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333567,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977247,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44692737430167595,
"acc_stderr": 0.01662803003964761,
"acc_norm": 0.44692737430167595,
"acc_norm_stderr": 0.01662803003964761
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.027634176689602656,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.027634176689602656
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153262,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532063,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532063
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.019910377463105935,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.019910377463105935
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38922888616891066,
"mc1_stderr": 0.017068552680690328,
"mc2": 0.5536706803409501,
"mc2_stderr": 0.01611557269809252
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tyzhu/fwv2_squad_rare_train_100_eval_100 | 2023-08-29T16:17:39.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 48776
num_examples: 300
- name: train_doc2id
num_bytes: 35788
num_examples: 200
- name: train_id2doc
num_bytes: 36388
num_examples: 200
- name: train_find_word
num_bytes: 12388
num_examples: 100
- name: eval_find_word
num_bytes: 11774
num_examples: 100
- name: id_context_mapping
num_bytes: 29988
num_examples: 200
download_size: 115703
dataset_size: 175102
---
# Dataset Card for "fwv2_squad_rare_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_squad_rare_train_1000_eval_100 | 2023-08-29T16:20:26.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 321689
num_examples: 2100
- name: train_doc2id
num_bytes: 195355
num_examples: 1100
- name: train_id2doc
num_bytes: 198655
num_examples: 1100
- name: train_find_word
num_bytes: 123034
num_examples: 1000
- name: eval_find_word
num_bytes: 11763
num_examples: 100
- name: id_context_mapping
num_bytes: 163455
num_examples: 1100
download_size: 576167
dataset_size: 1013951
---
# Dataset Card for "fwv2_squad_rare_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/GiftedConvoBeforeEcons | 2023-08-29T15:58:23.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|
vitaliy-sharandin/ai-incidents | 2023-09-05T23:36:42.000Z | [
"region:us"
] | vitaliy-sharandin | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: _id
dtype: string
- name: incident_id
dtype: int64
- name: date
dtype: timestamp[ns]
- name: reports
dtype: string
- name: Alleged deployer of AI system
dtype: string
- name: Alleged developer of AI system
dtype: string
- name: Alleged harmed or nearly harmed parties
dtype: string
- name: description
dtype: string
- name: title
dtype: string
- name: year
dtype: int64
- name: spacy_negative_outcomes
dtype: string
- name: keybert_negative_outcomes
dtype: string
- name: Cluster
dtype: string
splits:
- name: train
num_bytes: 271118
num_examples: 514
download_size: 165345
dataset_size: 271118
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ai-incidents"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Touretticat/repo_name | 2023-08-29T16:04:51.000Z | [
"region:us"
] | Touretticat | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Writer__InstructPalmyra-20b | 2023-08-29T16:06:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Writer/InstructPalmyra-20b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Writer/InstructPalmyra-20b](https://huggingface.co/Writer/InstructPalmyra-20b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Writer__InstructPalmyra-20b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T16:04:46.105936](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__InstructPalmyra-20b/blob/main/results_2023-08-29T16%3A04%3A46.105936.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.2902000987354527,\n \"\
acc_stderr\": 0.032616340491924696,\n \"acc_norm\": 0.2934157939493543,\n\
\ \"acc_norm_stderr\": 0.032607944078258226,\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.41814010073968827,\n\
\ \"mc2_stderr\": 0.015504016693772268\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.45563139931740615,\n \"acc_stderr\": 0.014553749939306864,\n\
\ \"acc_norm\": 0.4709897610921502,\n \"acc_norm_stderr\": 0.01458677635529432\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5556662019518024,\n\
\ \"acc_stderr\": 0.004958761056959782,\n \"acc_norm\": 0.7300338577972515,\n\
\ \"acc_norm_stderr\": 0.004430346234650379\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.034597776068105365,\n\
\ \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.034597776068105365\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.29056603773584905,\n \"acc_stderr\": 0.027943219989337152,\n\
\ \"acc_norm\": 0.29056603773584905,\n \"acc_norm_stderr\": 0.027943219989337152\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.17,\n\
\ \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.17341040462427745,\n\
\ \"acc_stderr\": 0.02886810787497064,\n \"acc_norm\": 0.17341040462427745,\n\
\ \"acc_norm_stderr\": 0.02886810787497064\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.02293097307163336,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.02293097307163336\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2870967741935484,\n \"acc_stderr\": 0.025736542745594528,\n \"\
acc_norm\": 0.2870967741935484,\n \"acc_norm_stderr\": 0.025736542745594528\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n \"\
acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3212121212121212,\n \"acc_stderr\": 0.036462049632538115,\n\
\ \"acc_norm\": 0.3212121212121212,\n \"acc_norm_stderr\": 0.036462049632538115\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.30303030303030304,\n \"acc_stderr\": 0.03274287914026868,\n \"\
acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.03274287914026868\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.03161877917935411,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.03161877917935411\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.021763733684173933,\n\
\ \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.021763733684173933\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.028657491285071973,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.028657491285071973\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.27155963302752295,\n \"acc_stderr\": 0.019069098363191445,\n \"\
acc_norm\": 0.27155963302752295,\n \"acc_norm_stderr\": 0.019069098363191445\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2175925925925926,\n \"acc_stderr\": 0.028139689444859676,\n \"\
acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.028139689444859676\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.31862745098039214,\n \"acc_stderr\": 0.03270287181482081,\n \"\
acc_norm\": 0.31862745098039214,\n \"acc_norm_stderr\": 0.03270287181482081\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.34177215189873417,\n \"acc_stderr\": 0.030874537537553617,\n \
\ \"acc_norm\": 0.34177215189873417,\n \"acc_norm_stderr\": 0.030874537537553617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.40358744394618834,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.40358744394618834,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\
: 0.371900826446281,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.32407407407407407,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.32407407407407407,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.31901840490797545,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.31901840490797545,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.34951456310679613,\n \"acc_stderr\": 0.047211885060971716,\n\
\ \"acc_norm\": 0.34951456310679613,\n \"acc_norm_stderr\": 0.047211885060971716\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.358974358974359,\n\
\ \"acc_stderr\": 0.031426169937919225,\n \"acc_norm\": 0.358974358974359,\n\
\ \"acc_norm_stderr\": 0.031426169937919225\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3371647509578544,\n\
\ \"acc_stderr\": 0.016905207420803547,\n \"acc_norm\": 0.3371647509578544,\n\
\ \"acc_norm_stderr\": 0.016905207420803547\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3179190751445087,\n \"acc_stderr\": 0.025070713719153186,\n\
\ \"acc_norm\": 0.3179190751445087,\n \"acc_norm_stderr\": 0.025070713719153186\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808836,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808836\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3006535947712418,\n \"acc_stderr\": 0.026256053835718964,\n\
\ \"acc_norm\": 0.3006535947712418,\n \"acc_norm_stderr\": 0.026256053835718964\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3665594855305466,\n\
\ \"acc_stderr\": 0.02736807824397163,\n \"acc_norm\": 0.3665594855305466,\n\
\ \"acc_norm_stderr\": 0.02736807824397163\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.02508947852376513,\n\
\ \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.02508947852376513\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.02668456434046099,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.02668456434046099\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27249022164276404,\n\
\ \"acc_stderr\": 0.011371658294311532,\n \"acc_norm\": 0.27249022164276404,\n\
\ \"acc_norm_stderr\": 0.011371658294311532\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.024880971512294292,\n\
\ \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.024880971512294292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.29411764705882354,\n \"acc_stderr\": 0.018433427649401896,\n \
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.018433427649401896\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23265306122448978,\n \"acc_stderr\": 0.02704925791589618,\n\
\ \"acc_norm\": 0.23265306122448978,\n \"acc_norm_stderr\": 0.02704925791589618\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3034825870646766,\n\
\ \"acc_stderr\": 0.03251006816458617,\n \"acc_norm\": 0.3034825870646766,\n\
\ \"acc_norm_stderr\": 0.03251006816458617\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n\
\ \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n\
\ \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.033014059469872514,\n\
\ \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.033014059469872514\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.41814010073968827,\n\
\ \"mc2_stderr\": 0.015504016693772268\n }\n}\n```"
repo_url: https://huggingface.co/Writer/InstructPalmyra-20b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|arc:challenge|25_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hellaswag|10_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:04:46.105936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:04:46.105936.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T16:04:46.105936.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T16:04:46.105936.parquet'
- config_name: results
data_files:
- split: 2023_08_29T16_04_46.105936
path:
- results_2023-08-29T16:04:46.105936.parquet
- split: latest
path:
- results_2023-08-29T16:04:46.105936.parquet
---
# Dataset Card for Evaluation run of Writer/InstructPalmyra-20b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Writer/InstructPalmyra-20b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Writer/InstructPalmyra-20b](https://huggingface.co/Writer/InstructPalmyra-20b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Writer__InstructPalmyra-20b",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T16:04:46.105936](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__InstructPalmyra-20b/blob/main/results_2023-08-29T16%3A04%3A46.105936.json):
```python
{
"all": {
"acc": 0.2902000987354527,
"acc_stderr": 0.032616340491924696,
"acc_norm": 0.2934157939493543,
"acc_norm_stderr": 0.032607944078258226,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.41814010073968827,
"mc2_stderr": 0.015504016693772268
},
"harness|arc:challenge|25": {
"acc": 0.45563139931740615,
"acc_stderr": 0.014553749939306864,
"acc_norm": 0.4709897610921502,
"acc_norm_stderr": 0.01458677635529432
},
"harness|hellaswag|10": {
"acc": 0.5556662019518024,
"acc_stderr": 0.004958761056959782,
"acc_norm": 0.7300338577972515,
"acc_norm_stderr": 0.004430346234650379
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.034597776068105365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.034597776068105365
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.29056603773584905,
"acc_stderr": 0.027943219989337152,
"acc_norm": 0.29056603773584905,
"acc_norm_stderr": 0.027943219989337152
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3125,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.17341040462427745,
"acc_stderr": 0.02886810787497064,
"acc_norm": 0.17341040462427745,
"acc_norm_stderr": 0.02886810787497064
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.02293097307163336,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.02293097307163336
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2870967741935484,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.2870967741935484,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3212121212121212,
"acc_stderr": 0.036462049632538115,
"acc_norm": 0.3212121212121212,
"acc_norm_stderr": 0.036462049632538115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.03274287914026868,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.03274287914026868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.03161877917935411,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.03161877917935411
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.021763733684173933,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.021763733684173933
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.028657491285071973,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.028657491285071973
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27155963302752295,
"acc_stderr": 0.019069098363191445,
"acc_norm": 0.27155963302752295,
"acc_norm_stderr": 0.019069098363191445
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.028139689444859676,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.028139689444859676
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.31862745098039214,
"acc_stderr": 0.03270287181482081,
"acc_norm": 0.31862745098039214,
"acc_norm_stderr": 0.03270287181482081
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.34177215189873417,
"acc_stderr": 0.030874537537553617,
"acc_norm": 0.34177215189873417,
"acc_norm_stderr": 0.030874537537553617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.40358744394618834,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.40358744394618834,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.31901840490797545,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.31901840490797545,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.34951456310679613,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.34951456310679613,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.031426169937919225,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.031426169937919225
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3371647509578544,
"acc_stderr": 0.016905207420803547,
"acc_norm": 0.3371647509578544,
"acc_norm_stderr": 0.016905207420803547
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3179190751445087,
"acc_stderr": 0.025070713719153186,
"acc_norm": 0.3179190751445087,
"acc_norm_stderr": 0.025070713719153186
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808836,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808836
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3006535947712418,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.3006535947712418,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3665594855305466,
"acc_stderr": 0.02736807824397163,
"acc_norm": 0.3665594855305466,
"acc_norm_stderr": 0.02736807824397163
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.02508947852376513,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.02508947852376513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.02668456434046099,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.02668456434046099
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27249022164276404,
"acc_stderr": 0.011371658294311532,
"acc_norm": 0.27249022164276404,
"acc_norm_stderr": 0.011371658294311532
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.024880971512294292,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.024880971512294292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.018433427649401896,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.018433427649401896
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23265306122448978,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.23265306122448978,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3034825870646766,
"acc_stderr": 0.03251006816458617,
"acc_norm": 0.3034825870646766,
"acc_norm_stderr": 0.03251006816458617
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.033014059469872514,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.033014059469872514
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.41814010073968827,
"mc2_stderr": 0.015504016693772268
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16 | 2023-08-29T21:14:27.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16](https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T21:13:05.130565](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16/blob/main/results_2023-08-29T21%3A13%3A05.130565.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4690258680406254,\n\
\ \"acc_stderr\": 0.03526595221704812,\n \"acc_norm\": 0.47314486143663886,\n\
\ \"acc_norm_stderr\": 0.03525068980912334,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.43311675077794465,\n\
\ \"mc2_stderr\": 0.014085064609011494\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49829351535836175,\n \"acc_stderr\": 0.014611305705056995,\n\
\ \"acc_norm\": 0.5409556313993175,\n \"acc_norm_stderr\": 0.014562291073601227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5910177255526787,\n\
\ \"acc_stderr\": 0.00490641198447679,\n \"acc_norm\": 0.7913762198765186,\n\
\ \"acc_norm_stderr\": 0.004054944548370489\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.04026097083296558,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.04026097083296558\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.03733626655383509,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.03733626655383509\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113946,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113946\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127155,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127155\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5096774193548387,\n\
\ \"acc_stderr\": 0.02843867799890955,\n \"acc_norm\": 0.5096774193548387,\n\
\ \"acc_norm_stderr\": 0.02843867799890955\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n\
\ \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.48484848484848486,\n \"acc_stderr\": 0.0356071651653106,\n \"\
acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.0356071651653106\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6683937823834197,\n \"acc_stderr\": 0.03397636541089118,\n\
\ \"acc_norm\": 0.6683937823834197,\n \"acc_norm_stderr\": 0.03397636541089118\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45384615384615384,\n \"acc_stderr\": 0.025242770987126177,\n\
\ \"acc_norm\": 0.45384615384615384,\n \"acc_norm_stderr\": 0.025242770987126177\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6256880733944954,\n \"acc_stderr\": 0.020748959408988306,\n \"\
acc_norm\": 0.6256880733944954,\n \"acc_norm_stderr\": 0.020748959408988306\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5245098039215687,\n \"acc_stderr\": 0.03505093194348798,\n \"\
acc_norm\": 0.5245098039215687,\n \"acc_norm_stderr\": 0.03505093194348798\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5949367088607594,\n \"acc_stderr\": 0.03195514741370671,\n \
\ \"acc_norm\": 0.5949367088607594,\n \"acc_norm_stderr\": 0.03195514741370671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437056,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437056\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.48466257668711654,\n \"acc_stderr\": 0.039265223787088424,\n\
\ \"acc_norm\": 0.48466257668711654,\n \"acc_norm_stderr\": 0.039265223787088424\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n\
\ \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.03023638994217308,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.03023638994217308\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6424010217113666,\n\
\ \"acc_stderr\": 0.01713948899880328,\n \"acc_norm\": 0.6424010217113666,\n\
\ \"acc_norm_stderr\": 0.01713948899880328\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5028901734104047,\n \"acc_stderr\": 0.026918645383239004,\n\
\ \"acc_norm\": 0.5028901734104047,\n \"acc_norm_stderr\": 0.026918645383239004\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556047,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556047\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325953,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325953\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3559322033898305,\n\
\ \"acc_stderr\": 0.012228645537277566,\n \"acc_norm\": 0.3559322033898305,\n\
\ \"acc_norm_stderr\": 0.012228645537277566\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.030320243265004144,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.030320243265004144\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4411764705882353,\n \"acc_stderr\": 0.02008736207670286,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.02008736207670286\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268813,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268813\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.43311675077794465,\n\
\ \"mc2_stderr\": 0.014085064609011494\n }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|arc:challenge|25_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|arc:challenge|25_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hellaswag|10_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hellaswag|10_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:04:44.471378.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T21:13:05.130565.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T21:13:05.130565.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T16:04:44.471378.parquet'
- split: 2023_08_29T21_13_05.130565
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T21:13:05.130565.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T21:13:05.130565.parquet'
- config_name: results
data_files:
- split: 2023_08_29T16_04_44.471378
path:
- results_2023-08-29T16:04:44.471378.parquet
- split: 2023_08_29T21_13_05.130565
path:
- results_2023-08-29T21:13:05.130565.parquet
- split: latest
path:
- results_2023-08-29T21:13:05.130565.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16](https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T21:13:05.130565](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16/blob/main/results_2023-08-29T21%3A13%3A05.130565.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4690258680406254,
"acc_stderr": 0.03526595221704812,
"acc_norm": 0.47314486143663886,
"acc_norm_stderr": 0.03525068980912334,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.43311675077794465,
"mc2_stderr": 0.014085064609011494
},
"harness|arc:challenge|25": {
"acc": 0.49829351535836175,
"acc_stderr": 0.014611305705056995,
"acc_norm": 0.5409556313993175,
"acc_norm_stderr": 0.014562291073601227
},
"harness|hellaswag|10": {
"acc": 0.5910177255526787,
"acc_stderr": 0.00490641198447679,
"acc_norm": 0.7913762198765186,
"acc_norm_stderr": 0.004054944548370489
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.04026097083296558,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.04026097083296558
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.46037735849056605,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.46037735849056605,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.03733626655383509,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.03733626655383509
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113946,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113946
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127155,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127155
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5096774193548387,
"acc_stderr": 0.02843867799890955,
"acc_norm": 0.5096774193548387,
"acc_norm_stderr": 0.02843867799890955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.038049136539710114,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.038049136539710114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.0356071651653106,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.0356071651653106
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6683937823834197,
"acc_stderr": 0.03397636541089118,
"acc_norm": 0.6683937823834197,
"acc_norm_stderr": 0.03397636541089118
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45384615384615384,
"acc_stderr": 0.025242770987126177,
"acc_norm": 0.45384615384615384,
"acc_norm_stderr": 0.025242770987126177
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6256880733944954,
"acc_stderr": 0.020748959408988306,
"acc_norm": 0.6256880733944954,
"acc_norm_stderr": 0.020748959408988306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5245098039215687,
"acc_stderr": 0.03505093194348798,
"acc_norm": 0.5245098039215687,
"acc_norm_stderr": 0.03505093194348798
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5949367088607594,
"acc_stderr": 0.03195514741370671,
"acc_norm": 0.5949367088607594,
"acc_norm_stderr": 0.03195514741370671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.48466257668711654,
"acc_stderr": 0.039265223787088424,
"acc_norm": 0.48466257668711654,
"acc_norm_stderr": 0.039265223787088424
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.03023638994217308,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.03023638994217308
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6424010217113666,
"acc_stderr": 0.01713948899880328,
"acc_norm": 0.6424010217113666,
"acc_norm_stderr": 0.01713948899880328
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.026918645383239004,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.026918645383239004
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556047,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556047
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325953,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3559322033898305,
"acc_stderr": 0.012228645537277566,
"acc_norm": 0.3559322033898305,
"acc_norm_stderr": 0.012228645537277566
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.030320243265004144,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.030320243265004144
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.02008736207670286,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.02008736207670286
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268813,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268813
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.43311675077794465,
"mc2_stderr": 0.014085064609011494
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tomcat2064/repo_name | 2023-08-29T16:10:25.000Z | [
"region:us"
] | tomcat2064 | null | null | null | 0 | 0 | Entry not found |
tomcat2064/new_look_dataset | 2023-08-29T17:04:33.000Z | [
"region:us"
] | tomcat2064 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v2 | 2023-08-29T16:17:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of nathan0/mpt_delta_tuned_model_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nathan0/mpt_delta_tuned_model_v2](https://huggingface.co/nathan0/mpt_delta_tuned_model_v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T16:16:19.015155](https://huggingface.co/datasets/open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v2/blob/main/results_2023-08-29T16%3A16%3A19.015155.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.2950832179561353,\n \"\
acc_stderr\": 0.0329561063051657,\n \"acc_norm\": 0.29907052345099405,\n\
\ \"acc_norm_stderr\": 0.03294521260985216,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871108,\n \"mc2\": 0.35471976554662815,\n\
\ \"mc2_stderr\": 0.013741277408130734\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.45819112627986347,\n \"acc_stderr\": 0.014560220308714697,\n\
\ \"acc_norm\": 0.5068259385665529,\n \"acc_norm_stderr\": 0.014610029151379813\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5774746066520613,\n\
\ \"acc_stderr\": 0.004929517011508221,\n \"acc_norm\": 0.7640908185620394,\n\
\ \"acc_norm_stderr\": 0.0042369801453443065\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n\
\ \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.2740740740740741,\n\
\ \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3169811320754717,\n \"acc_stderr\": 0.028637235639800918,\n\
\ \"acc_norm\": 0.3169811320754717,\n \"acc_norm_stderr\": 0.028637235639800918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.03156809362703175,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.03156809362703175\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.03878352372138622,\n\
\ \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.03878352372138622\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047736,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047736\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302052,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302052\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2903225806451613,\n \"acc_stderr\": 0.025822106119415898,\n \"\
acc_norm\": 0.2903225806451613,\n \"acc_norm_stderr\": 0.025822106119415898\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.26108374384236455,\n \"acc_stderr\": 0.03090379695211449,\n \"\
acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.03090379695211449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.30303030303030304,\n \"acc_stderr\": 0.03274287914026868,\n \"\
acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.03274287914026868\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.31088082901554404,\n \"acc_stderr\": 0.03340361906276585,\n\
\ \"acc_norm\": 0.31088082901554404,\n \"acc_norm_stderr\": 0.03340361906276585\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30512820512820515,\n \"acc_stderr\": 0.023346335293325887,\n\
\ \"acc_norm\": 0.30512820512820515,\n \"acc_norm_stderr\": 0.023346335293325887\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145675,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145675\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.029344572500634346,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.029344572500634346\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26972477064220185,\n \"acc_stderr\": 0.019028486711115445,\n \"\
acc_norm\": 0.26972477064220185,\n \"acc_norm_stderr\": 0.019028486711115445\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18518518518518517,\n \"acc_stderr\": 0.02649191472735514,\n \"\
acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.02649191472735514\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598035,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.3632286995515695,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3884297520661157,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.3884297520661157,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04668408033024932,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04668408033024932\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3247863247863248,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.3247863247863248,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29757343550446996,\n\
\ \"acc_stderr\": 0.01634911191290943,\n \"acc_norm\": 0.29757343550446996,\n\
\ \"acc_norm_stderr\": 0.01634911191290943\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.02536060379624256,\n\
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.02536060379624256\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.2861736334405145,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.02508947852376513,\n\
\ \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.02508947852376513\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340461,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340461\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25684485006518903,\n\
\ \"acc_stderr\": 0.011158455853098857,\n \"acc_norm\": 0.25684485006518903,\n\
\ \"acc_norm_stderr\": 0.011158455853098857\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.31209150326797386,\n \"acc_stderr\": 0.018745011201277657,\n \
\ \"acc_norm\": 0.31209150326797386,\n \"acc_norm_stderr\": 0.018745011201277657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.36363636363636365,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.36363636363636365,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.31020408163265306,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.31020408163265306,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n\
\ \"acc_stderr\": 0.0371172519074075,\n \"acc_norm\": 0.3493975903614458,\n\
\ \"acc_norm_stderr\": 0.0371172519074075\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871108,\n \"mc2\": 0.35471976554662815,\n\
\ \"mc2_stderr\": 0.013741277408130734\n }\n}\n```"
repo_url: https://huggingface.co/nathan0/mpt_delta_tuned_model_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|arc:challenge|25_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hellaswag|10_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:16:19.015155.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T16:16:19.015155.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T16:16:19.015155.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T16:16:19.015155.parquet'
- config_name: results
data_files:
- split: 2023_08_29T16_16_19.015155
path:
- results_2023-08-29T16:16:19.015155.parquet
- split: latest
path:
- results_2023-08-29T16:16:19.015155.parquet
---
# Dataset Card for Evaluation run of nathan0/mpt_delta_tuned_model_v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nathan0/mpt_delta_tuned_model_v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nathan0/mpt_delta_tuned_model_v2](https://huggingface.co/nathan0/mpt_delta_tuned_model_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v2",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T16:16:19.015155](https://huggingface.co/datasets/open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v2/blob/main/results_2023-08-29T16%3A16%3A19.015155.json):
```python
{
"all": {
"acc": 0.2950832179561353,
"acc_stderr": 0.0329561063051657,
"acc_norm": 0.29907052345099405,
"acc_norm_stderr": 0.03294521260985216,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871108,
"mc2": 0.35471976554662815,
"mc2_stderr": 0.013741277408130734
},
"harness|arc:challenge|25": {
"acc": 0.45819112627986347,
"acc_stderr": 0.014560220308714697,
"acc_norm": 0.5068259385665529,
"acc_norm_stderr": 0.014610029151379813
},
"harness|hellaswag|10": {
"acc": 0.5774746066520613,
"acc_stderr": 0.004929517011508221,
"acc_norm": 0.7640908185620394,
"acc_norm_stderr": 0.0042369801453443065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3169811320754717,
"acc_stderr": 0.028637235639800918,
"acc_norm": 0.3169811320754717,
"acc_norm_stderr": 0.028637235639800918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.03156809362703175,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.03156809362703175
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.03878352372138622,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.03878352372138622
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047736,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047736
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302052,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302052
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2903225806451613,
"acc_stderr": 0.025822106119415898,
"acc_norm": 0.2903225806451613,
"acc_norm_stderr": 0.025822106119415898
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.03090379695211449,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.03090379695211449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.03274287914026868,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.03274287914026868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.31088082901554404,
"acc_stderr": 0.03340361906276585,
"acc_norm": 0.31088082901554404,
"acc_norm_stderr": 0.03340361906276585
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30512820512820515,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.30512820512820515,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145675,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145675
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.029344572500634346,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.029344572500634346
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26972477064220185,
"acc_stderr": 0.019028486711115445,
"acc_norm": 0.26972477064220185,
"acc_norm_stderr": 0.019028486711115445
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.02649191472735514,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.02649191472735514
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3884297520661157,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.3884297520661157,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04668408033024932,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04668408033024932
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.045723723587374296,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.045723723587374296
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3247863247863248,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.3247863247863248,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29757343550446996,
"acc_stderr": 0.01634911191290943,
"acc_norm": 0.29757343550446996,
"acc_norm_stderr": 0.01634911191290943
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.02536060379624256,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.02536060379624256
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.02508947852376513,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.02508947852376513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340461,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340461
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25684485006518903,
"acc_stderr": 0.011158455853098857,
"acc_norm": 0.25684485006518903,
"acc_norm_stderr": 0.011158455853098857
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.31209150326797386,
"acc_stderr": 0.018745011201277657,
"acc_norm": 0.31209150326797386,
"acc_norm_stderr": 0.018745011201277657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.31020408163265306,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.31020408163265306,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.0371172519074075,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.0371172519074075
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871108,
"mc2": 0.35471976554662815,
"mc2_stderr": 0.013741277408130734
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tog/galleon-llama2-1k | 2023-08-29T16:57:37.000Z | [
"region:us"
] | tog | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 159012.9
num_examples: 900
- name: test
num_bytes: 17668.1
num_examples: 100
download_size: 89959
dataset_size: 176681.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "galleon-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tog/galleon-llama2-27k | 2023-08-30T08:33:18.000Z | [
"region:us"
] | tog | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4280355.9
num_examples: 24300
- name: test
num_bytes: 475595.1
num_examples: 2700
download_size: 2318132
dataset_size: 4755951.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "galleon-llama2-27k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KatMarie/eu_test6 | 2023-08-30T09:58:54.000Z | [
"region:us"
] | KatMarie | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2429668
num_examples: 41376
download_size: 1661037
dataset_size: 2429668
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "eu_test6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
St4n/datasets | 2023-08-29T16:43:13.000Z | [
"region:us"
] | St4n | null | null | null | 0 | 0 | Entry not found |
qazisaad/llama_2_optimized_product_titles-esci-temp | 2023-08-30T06:12:34.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: level_0
dtype: int64
- name: index
dtype: int64
- name: product_title
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: preds
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1526227
num_examples: 480
download_size: 300628
dataset_size: 1526227
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_optimized_product_titles-esci-temp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gipul/Unknown | 2023-08-29T16:48:54.000Z | [
"region:us"
] | gipul | null | null | null | 0 | 0 | Entry not found |
DMT1126/Voice | 2023-08-29T16:54:14.000Z | [
"license:unknown",
"region:us"
] | DMT1126 | null | null | null | 0 | 0 | ---
license: unknown
---
|
Musha-the-Yusha/mushi-snli-llama2-3k | 2023-08-29T17:36:32.000Z | [
"region:us"
] | Musha-the-Yusha | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2020257
num_examples: 10000
download_size: 723568
dataset_size: 2020257
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mushi-snli-llama2-3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jakir057/augmented_notes_9000 | 2023-08-29T17:03:08.000Z | [
"region:us"
] | Jakir057 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '10'
'1': '100'
'2': '1000'
'3': '2'
'4': '20'
'5': '200'
'6': '5'
'7': '50'
'8': '500'
splits:
- name: train
num_bytes: 62693081.95
num_examples: 7650
- name: test
num_bytes: 11383838.7
num_examples: 1350
download_size: 74957422
dataset_size: 74076920.65
---
# Dataset Card for "augmented_notes_9000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ABTHF/your-dataset-name | 2023-08-29T17:06:58.000Z | [
"region:us"
] | ABTHF | null | null | null | 0 | 0 | Entry not found |
cnulatienpo/firstwritingdataset | 2023-08-29T17:28:22.000Z | [
"region:us"
] | cnulatienpo | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1.1-GPTQ | 2023-08-29T17:25:31.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/WizardLM-13B-V1.1-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/WizardLM-13B-V1.1-GPTQ](https://huggingface.co/TheBloke/WizardLM-13B-V1.1-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1.1-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T17:24:13.256665](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1.1-GPTQ/blob/main/results_2023-08-29T17%3A24%3A13.256665.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.4989274387080137,\n \"\
acc_stderr\": 0.03528129823407584,\n \"acc_norm\": 0.5027041835411654,\n\
\ \"acc_norm_stderr\": 0.0352638484724738,\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.5435298001783399,\n\
\ \"mc2_stderr\": 0.015594790355285697\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5648464163822525,\n \"acc_stderr\": 0.014487986197186045,\n\
\ \"acc_norm\": 0.5853242320819113,\n \"acc_norm_stderr\": 0.014397070564409174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6042620991834295,\n\
\ \"acc_stderr\": 0.004880092083408045,\n \"acc_norm\": 0.8066122286397132,\n\
\ \"acc_norm_stderr\": 0.003941471781664183\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.041321250197233685,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.041321250197233685\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.031778212502369216,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.031778212502369216\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655802,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655802\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5709677419354838,\n \"acc_stderr\": 0.028156036538233193,\n \"\
acc_norm\": 0.5709677419354838,\n \"acc_norm_stderr\": 0.028156036538233193\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n \"\
acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4512820512820513,\n \"acc_stderr\": 0.02523038123893484,\n \
\ \"acc_norm\": 0.4512820512820513,\n \"acc_norm_stderr\": 0.02523038123893484\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5084033613445378,\n \"acc_stderr\": 0.03247390276569669,\n \
\ \"acc_norm\": 0.5084033613445378,\n \"acc_norm_stderr\": 0.03247390276569669\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6623853211009174,\n \"acc_stderr\": 0.02027526598663892,\n \"\
acc_norm\": 0.6623853211009174,\n \"acc_norm_stderr\": 0.02027526598663892\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6421568627450981,\n \"acc_stderr\": 0.033644872860883,\n \"acc_norm\"\
: 0.6421568627450981,\n \"acc_norm_stderr\": 0.033644872860883\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.6582278481012658,\n \"acc_stderr\": 0.03087453753755362,\n \"\
acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.03087453753755362\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4798206278026906,\n\
\ \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.4798206278026906,\n\
\ \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.043389203057924,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.043389203057924\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179662,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179662\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.717948717948718,\n\
\ \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.717948717948718,\n\
\ \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6615581098339719,\n\
\ \"acc_stderr\": 0.01692086958621067,\n \"acc_norm\": 0.6615581098339719,\n\
\ \"acc_norm_stderr\": 0.01692086958621067\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.02672003438051499,\n\
\ \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.02672003438051499\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n\
\ \"acc_stderr\": 0.015251931579208185,\n \"acc_norm\": 0.29497206703910617,\n\
\ \"acc_norm_stderr\": 0.015251931579208185\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631445,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631445\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5369774919614148,\n\
\ \"acc_stderr\": 0.02832032583010591,\n \"acc_norm\": 0.5369774919614148,\n\
\ \"acc_norm_stderr\": 0.02832032583010591\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596147,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40352020860495436,\n\
\ \"acc_stderr\": 0.012530241301193195,\n \"acc_norm\": 0.40352020860495436,\n\
\ \"acc_norm_stderr\": 0.012530241301193195\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.03036544647727568,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.03036544647727568\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47875816993464054,\n \"acc_stderr\": 0.020209572388600255,\n \
\ \"acc_norm\": 0.47875816993464054,\n \"acc_norm_stderr\": 0.020209572388600255\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.031512360446742674,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.031512360446742674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n\
\ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.034240429246915824,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.034240429246915824\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.5435298001783399,\n\
\ \"mc2_stderr\": 0.015594790355285697\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/WizardLM-13B-V1.1-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|arc:challenge|25_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hellaswag|10_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:24:13.256665.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:24:13.256665.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T17:24:13.256665.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T17:24:13.256665.parquet'
- config_name: results
data_files:
- split: 2023_08_29T17_24_13.256665
path:
- results_2023-08-29T17:24:13.256665.parquet
- split: latest
path:
- results_2023-08-29T17:24:13.256665.parquet
---
# Dataset Card for Evaluation run of TheBloke/WizardLM-13B-V1.1-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/WizardLM-13B-V1.1-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-13B-V1.1-GPTQ](https://huggingface.co/TheBloke/WizardLM-13B-V1.1-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1.1-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T17:24:13.256665](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1.1-GPTQ/blob/main/results_2023-08-29T17%3A24%3A13.256665.json):
```python
{
"all": {
"acc": 0.4989274387080137,
"acc_stderr": 0.03528129823407584,
"acc_norm": 0.5027041835411654,
"acc_norm_stderr": 0.0352638484724738,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.5435298001783399,
"mc2_stderr": 0.015594790355285697
},
"harness|arc:challenge|25": {
"acc": 0.5648464163822525,
"acc_stderr": 0.014487986197186045,
"acc_norm": 0.5853242320819113,
"acc_norm_stderr": 0.014397070564409174
},
"harness|hellaswag|10": {
"acc": 0.6042620991834295,
"acc_stderr": 0.004880092083408045,
"acc_norm": 0.8066122286397132,
"acc_norm_stderr": 0.003941471781664183
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5358490566037736,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.5358490566037736,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.041321250197233685,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.041321250197233685
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.031778212502369216,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.031778212502369216
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.023577604791655802,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.023577604791655802
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5709677419354838,
"acc_stderr": 0.028156036538233193,
"acc_norm": 0.5709677419354838,
"acc_norm_stderr": 0.028156036538233193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4512820512820513,
"acc_stderr": 0.02523038123893484,
"acc_norm": 0.4512820512820513,
"acc_norm_stderr": 0.02523038123893484
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184407,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184407
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5084033613445378,
"acc_stderr": 0.03247390276569669,
"acc_norm": 0.5084033613445378,
"acc_norm_stderr": 0.03247390276569669
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6623853211009174,
"acc_stderr": 0.02027526598663892,
"acc_norm": 0.6623853211009174,
"acc_norm_stderr": 0.02027526598663892
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.033644872860883,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.033644872860883
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6582278481012658,
"acc_stderr": 0.03087453753755362,
"acc_norm": 0.6582278481012658,
"acc_norm_stderr": 0.03087453753755362
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4798206278026906,
"acc_stderr": 0.033530461674123,
"acc_norm": 0.4798206278026906,
"acc_norm_stderr": 0.033530461674123
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.043389203057924,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.043389203057924
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179662,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179662
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6615581098339719,
"acc_stderr": 0.01692086958621067,
"acc_norm": 0.6615581098339719,
"acc_norm_stderr": 0.01692086958621067
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.02672003438051499,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.02672003438051499
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.015251931579208185,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.015251931579208185
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631445,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631445
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5369774919614148,
"acc_stderr": 0.02832032583010591,
"acc_norm": 0.5369774919614148,
"acc_norm_stderr": 0.02832032583010591
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596147,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40352020860495436,
"acc_stderr": 0.012530241301193195,
"acc_norm": 0.40352020860495436,
"acc_norm_stderr": 0.012530241301193195
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.03036544647727568,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.03036544647727568
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47875816993464054,
"acc_stderr": 0.020209572388600255,
"acc_norm": 0.47875816993464054,
"acc_norm_stderr": 0.020209572388600255
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.031512360446742674,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.031512360446742674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.5435298001783399,
"mc2_stderr": 0.015594790355285697
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
amasing7/sf-dummy | 2023-08-29T17:59:12.000Z | [
"region:us"
] | amasing7 | null | null | null | 0 | 0 | Entry not found |
Seenka/directv-zocalos_1.0fps_21-08-2023_24-08-2023 | 2023-08-29T18:07:16.000Z | [
"region:us"
] | Seenka | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_filename
dtype: string
- name: frame_time
dtype: time64[us]
- name: video_storage_path
dtype: string
- name: zocalo_id
dtype: string
- name: frame_number
dtype: int64
- name: is_L_shape
dtype: bool
- name: horizontal_check
dtype: bool
- name: vertical_check
dtype: bool
- name: black_image
dtype: bool
- name: horizontal_xmin
dtype: int64
- name: horizontal_xmax
dtype: int64
- name: horizontal_ymin
dtype: int64
- name: horizontal_ymax
dtype: int64
- name: vertical_xmin
dtype: int64
- name: vertical_xmax
dtype: int64
- name: vertical_ymin
dtype: int64
- name: vertical_ymax
dtype: int64
- name: cropped_image_horizontal
dtype: image
- name: cropped_image_vertical
dtype: 'null'
- name: width
dtype: int64
- name: height
dtype: int64
- name: embedding_horizontal
sequence: float32
splits:
- name: train
num_bytes: 8554665.0
num_examples: 10
download_size: 4263397
dataset_size: 8554665.0
---
# Dataset Card for "directv-zocalos_1.0fps_21-08-2023_24-08-2023"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__vicuna-13b-v1.3.0-GPTQ | 2023-08-29T17:38:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/vicuna-13b-v1.3.0-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/vicuna-13b-v1.3.0-GPTQ](https://huggingface.co/TheBloke/vicuna-13b-v1.3.0-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__vicuna-13b-v1.3.0-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T17:36:46.584597](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__vicuna-13b-v1.3.0-GPTQ/blob/main/results_2023-08-29T17%3A36%3A46.584597.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5208688458263941,\n \"\
acc_stderr\": 0.034919052518984244,\n \"acc_norm\": 0.5247318627818371,\n\
\ \"acc_norm_stderr\": 0.034903520327008546,\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.5088488034487862,\n\
\ \"mc2_stderr\": 0.015405211397549821\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.515358361774744,\n \"acc_stderr\": 0.01460449612939491,\n\
\ \"acc_norm\": 0.5435153583617748,\n \"acc_norm_stderr\": 0.014555949760496442\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.594901414060944,\n\
\ \"acc_stderr\": 0.004899078300184252,\n \"acc_norm\": 0.7946624178450508,\n\
\ \"acc_norm_stderr\": 0.004031225342516806\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5094339622641509,\n \"acc_stderr\": 0.030767394707808093,\n\
\ \"acc_norm\": 0.5094339622641509,\n \"acc_norm_stderr\": 0.030767394707808093\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5870967741935483,\n \"acc_stderr\": 0.02800913812540039,\n \"\
acc_norm\": 0.5870967741935483,\n \"acc_norm_stderr\": 0.02800913812540039\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n \"\
acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161549,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161549\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6616161616161617,\n \"acc_stderr\": 0.03371124142626302,\n \"\
acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.03371124142626302\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.02517404838400076,\n \
\ \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.02517404838400076\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895992,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895992\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7064220183486238,\n \"acc_stderr\": 0.019525151122639667,\n \"\
acc_norm\": 0.7064220183486238,\n \"acc_norm_stderr\": 0.019525151122639667\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236434,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236434\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6835443037974683,\n \"acc_stderr\": 0.03027497488021898,\n \
\ \"acc_norm\": 0.6835443037974683,\n \"acc_norm_stderr\": 0.03027497488021898\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041019,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041019\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.03731133519673893,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.03731133519673893\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652244,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652244\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7062579821200511,\n\
\ \"acc_stderr\": 0.016287759388491665,\n \"acc_norm\": 0.7062579821200511,\n\
\ \"acc_norm_stderr\": 0.016287759388491665\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.02663653974111608,\n\
\ \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.02663653974111608\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3016759776536313,\n\
\ \"acc_stderr\": 0.015350767572220286,\n \"acc_norm\": 0.3016759776536313,\n\
\ \"acc_norm_stderr\": 0.015350767572220286\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946208,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946208\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5802469135802469,\n \"acc_stderr\": 0.027460099557005135,\n\
\ \"acc_norm\": 0.5802469135802469,\n \"acc_norm_stderr\": 0.027460099557005135\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.02904919034254346,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.02904919034254346\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42503259452411996,\n\
\ \"acc_stderr\": 0.012625879884891993,\n \"acc_norm\": 0.42503259452411996,\n\
\ \"acc_norm_stderr\": 0.012625879884891993\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5196078431372549,\n \"acc_stderr\": 0.020212274976302957,\n \
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.020212274976302957\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5836734693877551,\n \"acc_stderr\": 0.031557828165561644,\n\
\ \"acc_norm\": 0.5836734693877551,\n \"acc_norm_stderr\": 0.031557828165561644\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686399,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686399\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n\
\ \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.5088488034487862,\n\
\ \"mc2_stderr\": 0.015405211397549821\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/vicuna-13b-v1.3.0-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|arc:challenge|25_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hellaswag|10_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:36:46.584597.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:36:46.584597.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T17:36:46.584597.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T17:36:46.584597.parquet'
- config_name: results
data_files:
- split: 2023_08_29T17_36_46.584597
path:
- results_2023-08-29T17:36:46.584597.parquet
- split: latest
path:
- results_2023-08-29T17:36:46.584597.parquet
---
# Dataset Card for Evaluation run of TheBloke/vicuna-13b-v1.3.0-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/vicuna-13b-v1.3.0-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/vicuna-13b-v1.3.0-GPTQ](https://huggingface.co/TheBloke/vicuna-13b-v1.3.0-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__vicuna-13b-v1.3.0-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T17:36:46.584597](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__vicuna-13b-v1.3.0-GPTQ/blob/main/results_2023-08-29T17%3A36%3A46.584597.json):
```python
{
"all": {
"acc": 0.5208688458263941,
"acc_stderr": 0.034919052518984244,
"acc_norm": 0.5247318627818371,
"acc_norm_stderr": 0.034903520327008546,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502025,
"mc2": 0.5088488034487862,
"mc2_stderr": 0.015405211397549821
},
"harness|arc:challenge|25": {
"acc": 0.515358361774744,
"acc_stderr": 0.01460449612939491,
"acc_norm": 0.5435153583617748,
"acc_norm_stderr": 0.014555949760496442
},
"harness|hellaswag|10": {
"acc": 0.594901414060944,
"acc_stderr": 0.004899078300184252,
"acc_norm": 0.7946624178450508,
"acc_norm_stderr": 0.004031225342516806
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4868421052631579,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.4868421052631579,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5094339622641509,
"acc_stderr": 0.030767394707808093,
"acc_norm": 0.5094339622641509,
"acc_norm_stderr": 0.030767394707808093
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5870967741935483,
"acc_stderr": 0.02800913812540039,
"acc_norm": 0.5870967741935483,
"acc_norm_stderr": 0.02800913812540039
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03681050869161549,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03681050869161549
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.03371124142626302,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.03371124142626302
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.02517404838400076,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.02517404838400076
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895992,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895992
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4831932773109244,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.4831932773109244,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7064220183486238,
"acc_stderr": 0.019525151122639667,
"acc_norm": 0.7064220183486238,
"acc_norm_stderr": 0.019525151122639667
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236434,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236434
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6835443037974683,
"acc_stderr": 0.03027497488021898,
"acc_norm": 0.6835443037974683,
"acc_norm_stderr": 0.03027497488021898
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041019,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041019
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652244,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652244
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7062579821200511,
"acc_stderr": 0.016287759388491665,
"acc_norm": 0.7062579821200511,
"acc_norm_stderr": 0.016287759388491665
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.02663653974111608,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.02663653974111608
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3016759776536313,
"acc_stderr": 0.015350767572220286,
"acc_norm": 0.3016759776536313,
"acc_norm_stderr": 0.015350767572220286
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946208,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946208
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5802469135802469,
"acc_stderr": 0.027460099557005135,
"acc_norm": 0.5802469135802469,
"acc_norm_stderr": 0.027460099557005135
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.02904919034254346,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.02904919034254346
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42503259452411996,
"acc_stderr": 0.012625879884891993,
"acc_norm": 0.42503259452411996,
"acc_norm_stderr": 0.012625879884891993
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.020212274976302957,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.020212274976302957
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5836734693877551,
"acc_stderr": 0.031557828165561644,
"acc_norm": 0.5836734693877551,
"acc_norm_stderr": 0.031557828165561644
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686399,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686399
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502025,
"mc2": 0.5088488034487862,
"mc2_stderr": 0.015405211397549821
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gddgdg/guanaco-llama2-1k | 2023-08-29T17:38:51.000Z | [
"region:us"
] | gddgdg | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v2 | 2023-08-29T17:47:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Phind/Phind-CodeLlama-34B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Phind/Phind-CodeLlama-34B-v2](https://huggingface.co/Phind/Phind-CodeLlama-34B-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T17:45:53.549865](https://huggingface.co/datasets/open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v2/blob/main/results_2023-08-29T17%3A45%3A53.549865.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.25672068936384074,\n \"\
acc_stderr\": 0.03168747865977248,\n \"acc_norm\": 0.2577352717575166,\n\
\ \"acc_norm_stderr\": 0.03170369807637918,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156486,\n \"mc2\": 0.48371839177847137,\n\
\ \"mc2_stderr\": 0.016336145719705417\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20051194539249148,\n \"acc_stderr\": 0.011700318050499378,\n\
\ \"acc_norm\": 0.24573378839590443,\n \"acc_norm_stderr\": 0.012581033453730102\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2614021111332404,\n\
\ \"acc_stderr\": 0.0043850049989234635,\n \"acc_norm\": 0.27604062935670187,\n\
\ \"acc_norm_stderr\": 0.004461235175488317\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1259259259259259,\n\
\ \"acc_stderr\": 0.02866020527595507,\n \"acc_norm\": 0.1259259259259259,\n\
\ \"acc_norm_stderr\": 0.02866020527595507\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.026616482980501715,\n\
\ \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.026616482980501715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624576,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624576\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.19148936170212766,\n \"acc_stderr\": 0.025722149992637805,\n\
\ \"acc_norm\": 0.19148936170212766,\n \"acc_norm_stderr\": 0.025722149992637805\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18387096774193548,\n\
\ \"acc_stderr\": 0.022037217340267836,\n \"acc_norm\": 0.18387096774193548,\n\
\ \"acc_norm_stderr\": 0.022037217340267836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.02850137816789395,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.02850137816789395\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.031544498882702866,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.031544498882702866\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.03051611137147601,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.03051611137147601\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.25384615384615383,\n \"acc_stderr\": 0.022066054378726253,\n\
\ \"acc_norm\": 0.25384615384615383,\n \"acc_norm_stderr\": 0.022066054378726253\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766118,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766118\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36554621848739494,\n \"acc_stderr\": 0.031282177063684614,\n\
\ \"acc_norm\": 0.36554621848739494,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343578,\n \"\
acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343578\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n\
\ \"acc_stderr\": 0.030778554678693257,\n \"acc_norm\": 0.25980392156862747,\n\
\ \"acc_norm_stderr\": 0.030778554678693257\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21524663677130046,\n\
\ \"acc_stderr\": 0.02758406660220827,\n \"acc_norm\": 0.21524663677130046,\n\
\ \"acc_norm_stderr\": 0.02758406660220827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.028911208802749486,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.028911208802749486\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2260536398467433,\n\
\ \"acc_stderr\": 0.014957458504335833,\n \"acc_norm\": 0.2260536398467433,\n\
\ \"acc_norm_stderr\": 0.014957458504335833\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757187,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757187\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n\
\ \"acc_stderr\": 0.014696599650364546,\n \"acc_norm\": 0.26145251396648045,\n\
\ \"acc_norm_stderr\": 0.014696599650364546\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879337,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879337\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451156,\n\
\ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451156\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642966,\n \
\ \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642966\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.31985294117647056,\n \"acc_stderr\": 0.028332959514031225,\n\
\ \"acc_norm\": 0.31985294117647056,\n \"acc_norm_stderr\": 0.028332959514031225\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24673202614379086,\n \"acc_stderr\": 0.017440820367402493,\n \
\ \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.017440820367402493\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724136,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724136\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3142857142857143,\n \"acc_stderr\": 0.029719329422417468,\n\
\ \"acc_norm\": 0.3142857142857143,\n \"acc_norm_stderr\": 0.029719329422417468\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.208955223880597,\n\
\ \"acc_stderr\": 0.028748298931728658,\n \"acc_norm\": 0.208955223880597,\n\
\ \"acc_norm_stderr\": 0.028748298931728658\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.034843315926805875,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.034843315926805875\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156486,\n \"mc2\": 0.48371839177847137,\n\
\ \"mc2_stderr\": 0.016336145719705417\n }\n}\n```"
repo_url: https://huggingface.co/Phind/Phind-CodeLlama-34B-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|arc:challenge|25_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hellaswag|10_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:45:53.549865.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T17:45:53.549865.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T17:45:53.549865.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T17:45:53.549865.parquet'
- config_name: results
data_files:
- split: 2023_08_29T17_45_53.549865
path:
- results_2023-08-29T17:45:53.549865.parquet
- split: latest
path:
- results_2023-08-29T17:45:53.549865.parquet
---
# Dataset Card for Evaluation run of Phind/Phind-CodeLlama-34B-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Phind/Phind-CodeLlama-34B-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Phind/Phind-CodeLlama-34B-v2](https://huggingface.co/Phind/Phind-CodeLlama-34B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v2",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T17:45:53.549865](https://huggingface.co/datasets/open-llm-leaderboard/details_Phind__Phind-CodeLlama-34B-v2/blob/main/results_2023-08-29T17%3A45%3A53.549865.json):
```python
{
"all": {
"acc": 0.25672068936384074,
"acc_stderr": 0.03168747865977248,
"acc_norm": 0.2577352717575166,
"acc_norm_stderr": 0.03170369807637918,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156486,
"mc2": 0.48371839177847137,
"mc2_stderr": 0.016336145719705417
},
"harness|arc:challenge|25": {
"acc": 0.20051194539249148,
"acc_stderr": 0.011700318050499378,
"acc_norm": 0.24573378839590443,
"acc_norm_stderr": 0.012581033453730102
},
"harness|hellaswag|10": {
"acc": 0.2614021111332404,
"acc_stderr": 0.0043850049989234635,
"acc_norm": 0.27604062935670187,
"acc_norm_stderr": 0.004461235175488317
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1259259259259259,
"acc_stderr": 0.02866020527595507,
"acc_norm": 0.1259259259259259,
"acc_norm_stderr": 0.02866020527595507
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2490566037735849,
"acc_stderr": 0.026616482980501715,
"acc_norm": 0.2490566037735849,
"acc_norm_stderr": 0.026616482980501715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624576,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624576
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.19148936170212766,
"acc_stderr": 0.025722149992637805,
"acc_norm": 0.19148936170212766,
"acc_norm_stderr": 0.025722149992637805
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18387096774193548,
"acc_stderr": 0.022037217340267836,
"acc_norm": 0.18387096774193548,
"acc_norm_stderr": 0.022037217340267836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.02850137816789395,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.02850137816789395
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.031544498882702866,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.031544498882702866
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.25384615384615383,
"acc_stderr": 0.022066054378726253,
"acc_norm": 0.25384615384615383,
"acc_norm_stderr": 0.022066054378726253
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766118,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766118
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36554621848739494,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.36554621848739494,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343578,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343578
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693257,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21524663677130046,
"acc_stderr": 0.02758406660220827,
"acc_norm": 0.21524663677130046,
"acc_norm_stderr": 0.02758406660220827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749486,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749486
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2260536398467433,
"acc_stderr": 0.014957458504335833,
"acc_norm": 0.2260536398467433,
"acc_norm_stderr": 0.014957458504335833
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757187,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757187
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.014696599650364546,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.014696599650364546
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879337,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879337
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.023468429832451156,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.023468429832451156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.024987106365642966,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.024987106365642966
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.31985294117647056,
"acc_stderr": 0.028332959514031225,
"acc_norm": 0.31985294117647056,
"acc_norm_stderr": 0.028332959514031225
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24673202614379086,
"acc_stderr": 0.017440820367402493,
"acc_norm": 0.24673202614379086,
"acc_norm_stderr": 0.017440820367402493
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724136,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724136
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3142857142857143,
"acc_stderr": 0.029719329422417468,
"acc_norm": 0.3142857142857143,
"acc_norm_stderr": 0.029719329422417468
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.208955223880597,
"acc_stderr": 0.028748298931728658,
"acc_norm": 0.208955223880597,
"acc_norm_stderr": 0.028748298931728658
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156486,
"mc2": 0.48371839177847137,
"mc2_stderr": 0.016336145719705417
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mrmocciai/mrmocci | 2023-10-06T14:20:47.000Z | [
"language:en",
"license:mit",
"code",
"region:us"
] | mrmocciai | null | null | null | 0 | 0 | ---
license: mit
language:
- en
tags:
- code
---
<div align="center">
<b> VOICE CONVERSATION BACKUP</b><br />
[Original Repo](https://github.com/RVC-Project/Retrieval-based-Voice-Conversion-WebUI) |
aditijha/instruct_v3_subset_2 | 2023-08-29T17:58:41.000Z | [
"region:us"
] | aditijha | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 3930962.2554168818
num_examples: 1000
download_size: 2048066
dataset_size: 3930962.2554168818
---
# Dataset Card for "instruct_v3_subset_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_NousResearch__CodeLlama-13b-hf | 2023-08-29T18:15:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NousResearch/CodeLlama-13b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NousResearch/CodeLlama-13b-hf](https://huggingface.co/NousResearch/CodeLlama-13b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__CodeLlama-13b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T18:13:52.290314](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__CodeLlama-13b-hf/blob/main/results_2023-08-29T18%3A13%3A52.290314.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.33133305646279027,\n\
\ \"acc_stderr\": 0.033819526019171764,\n \"acc_norm\": 0.3346318411229531,\n\
\ \"acc_norm_stderr\": 0.03381993357374892,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.43794269776602796,\n\
\ \"mc2_stderr\": 0.01446900625927817\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3779863481228669,\n \"acc_stderr\": 0.014169664520303101,\n\
\ \"acc_norm\": 0.4087030716723549,\n \"acc_norm_stderr\": 0.014365750345427006\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.469627564230233,\n\
\ \"acc_stderr\": 0.004980566907790459,\n \"acc_norm\": 0.6335391356303525,\n\
\ \"acc_norm_stderr\": 0.0048085268027185865\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.30057803468208094,\n\
\ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.30057803468208094,\n\
\ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.02271746789770861,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.02271746789770861\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3419354838709677,\n\
\ \"acc_stderr\": 0.026985289576552725,\n \"acc_norm\": 0.3419354838709677,\n\
\ \"acc_norm_stderr\": 0.026985289576552725\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937523,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937523\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47150259067357514,\n \"acc_stderr\": 0.036025735712884414,\n\
\ \"acc_norm\": 0.47150259067357514,\n \"acc_norm_stderr\": 0.036025735712884414\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.024035489676335065,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.024035489676335065\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3192660550458716,\n \"acc_stderr\": 0.019987829069750017,\n \"\
acc_norm\": 0.3192660550458716,\n \"acc_norm_stderr\": 0.019987829069750017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3088235294117647,\n \"acc_stderr\": 0.03242661719827218,\n \"\
acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3628691983122363,\n \"acc_stderr\": 0.03129920825530213,\n \
\ \"acc_norm\": 0.3628691983122363,\n \"acc_norm_stderr\": 0.03129920825530213\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.35877862595419846,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.35877862595419846,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n\
\ \"acc_stderr\": 0.04489931073591311,\n \"acc_norm\": 0.3148148148148148,\n\
\ \"acc_norm_stderr\": 0.04489931073591311\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3883495145631068,\n \"acc_stderr\": 0.04825729337356391,\n\
\ \"acc_norm\": 0.3883495145631068,\n \"acc_norm_stderr\": 0.04825729337356391\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5213675213675214,\n\
\ \"acc_stderr\": 0.032726164476349545,\n \"acc_norm\": 0.5213675213675214,\n\
\ \"acc_norm_stderr\": 0.032726164476349545\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.438058748403576,\n\
\ \"acc_stderr\": 0.017742232238257223,\n \"acc_norm\": 0.438058748403576,\n\
\ \"acc_norm_stderr\": 0.017742232238257223\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.02218347766841286,\n\
\ \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.02218347766841286\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369918,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369918\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.027184498909941613,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.027184498909941613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3890675241157556,\n\
\ \"acc_stderr\": 0.027690337536485376,\n \"acc_norm\": 0.3890675241157556,\n\
\ \"acc_norm_stderr\": 0.027690337536485376\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.36728395061728397,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.36728395061728397,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859063,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859063\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2966101694915254,\n\
\ \"acc_stderr\": 0.011665946586082838,\n \"acc_norm\": 0.2966101694915254,\n\
\ \"acc_norm_stderr\": 0.011665946586082838\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.30514705882352944,\n \"acc_stderr\": 0.027971541370170605,\n\
\ \"acc_norm\": 0.30514705882352944,\n \"acc_norm_stderr\": 0.027971541370170605\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27941176470588236,\n \"acc_stderr\": 0.018152871051538816,\n \
\ \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.018152871051538816\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.417910447761194,\n\
\ \"acc_stderr\": 0.034875586404620636,\n \"acc_norm\": 0.417910447761194,\n\
\ \"acc_norm_stderr\": 0.034875586404620636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n\
\ \"acc_stderr\": 0.0371172519074075,\n \"acc_norm\": 0.3493975903614458,\n\
\ \"acc_norm_stderr\": 0.0371172519074075\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4327485380116959,\n \"acc_stderr\": 0.03799978644370607,\n\
\ \"acc_norm\": 0.4327485380116959,\n \"acc_norm_stderr\": 0.03799978644370607\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.43794269776602796,\n\
\ \"mc2_stderr\": 0.01446900625927817\n }\n}\n```"
repo_url: https://huggingface.co/NousResearch/CodeLlama-13b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|arc:challenge|25_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hellaswag|10_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:13:52.290314.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:13:52.290314.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T18:13:52.290314.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T18:13:52.290314.parquet'
- config_name: results
data_files:
- split: 2023_08_29T18_13_52.290314
path:
- results_2023-08-29T18:13:52.290314.parquet
- split: latest
path:
- results_2023-08-29T18:13:52.290314.parquet
---
# Dataset Card for Evaluation run of NousResearch/CodeLlama-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/CodeLlama-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/CodeLlama-13b-hf](https://huggingface.co/NousResearch/CodeLlama-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__CodeLlama-13b-hf",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T18:13:52.290314](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__CodeLlama-13b-hf/blob/main/results_2023-08-29T18%3A13%3A52.290314.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.33133305646279027,
"acc_stderr": 0.033819526019171764,
"acc_norm": 0.3346318411229531,
"acc_norm_stderr": 0.03381993357374892,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.43794269776602796,
"mc2_stderr": 0.01446900625927817
},
"harness|arc:challenge|25": {
"acc": 0.3779863481228669,
"acc_stderr": 0.014169664520303101,
"acc_norm": 0.4087030716723549,
"acc_norm_stderr": 0.014365750345427006
},
"harness|hellaswag|10": {
"acc": 0.469627564230233,
"acc_stderr": 0.004980566907790459,
"acc_norm": 0.6335391356303525,
"acc_norm_stderr": 0.0048085268027185865
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.02271746789770861,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.02271746789770861
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3419354838709677,
"acc_stderr": 0.026985289576552725,
"acc_norm": 0.3419354838709677,
"acc_norm_stderr": 0.026985289576552725
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.031089826002937523,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.031089826002937523
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47150259067357514,
"acc_stderr": 0.036025735712884414,
"acc_norm": 0.47150259067357514,
"acc_norm_stderr": 0.036025735712884414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.024035489676335065,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.024035489676335065
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3192660550458716,
"acc_stderr": 0.019987829069750017,
"acc_norm": 0.3192660550458716,
"acc_norm_stderr": 0.019987829069750017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3628691983122363,
"acc_stderr": 0.03129920825530213,
"acc_norm": 0.3628691983122363,
"acc_norm_stderr": 0.03129920825530213
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.35877862595419846,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.35877862595419846,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591311,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591311
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.3883495145631068,
"acc_stderr": 0.04825729337356391,
"acc_norm": 0.3883495145631068,
"acc_norm_stderr": 0.04825729337356391
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5213675213675214,
"acc_stderr": 0.032726164476349545,
"acc_norm": 0.5213675213675214,
"acc_norm_stderr": 0.032726164476349545
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.438058748403576,
"acc_stderr": 0.017742232238257223,
"acc_norm": 0.438058748403576,
"acc_norm_stderr": 0.017742232238257223
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.02218347766841286,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.02218347766841286
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369918,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369918
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.027184498909941613,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.027184498909941613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3890675241157556,
"acc_stderr": 0.027690337536485376,
"acc_norm": 0.3890675241157556,
"acc_norm_stderr": 0.027690337536485376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.36728395061728397,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.36728395061728397,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.02646903681859063,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.02646903681859063
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2966101694915254,
"acc_stderr": 0.011665946586082838,
"acc_norm": 0.2966101694915254,
"acc_norm_stderr": 0.011665946586082838
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.30514705882352944,
"acc_stderr": 0.027971541370170605,
"acc_norm": 0.30514705882352944,
"acc_norm_stderr": 0.027971541370170605
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.018152871051538816,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.018152871051538816
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.417910447761194,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.417910447761194,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.0371172519074075,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.0371172519074075
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4327485380116959,
"acc_stderr": 0.03799978644370607,
"acc_norm": 0.4327485380116959,
"acc_norm_stderr": 0.03799978644370607
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.43794269776602796,
"mc2_stderr": 0.01446900625927817
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
vikas-mehta-cohere-health/sample | 2023-08-29T18:14:58.000Z | [
"size_categories:n<1K",
"rlfh",
"argilla",
"human-feedback",
"region:us"
] | vikas-mehta-cohere-health | null | null | null | 0 | 0 | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for sample
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("vikas-mehta-cohere-health/sample")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("vikas-mehta-cohere-health/sample")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/guides/llms/conceptual_guides/data_model.html) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are suppported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| text | Text | TextField | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, single choice, or multiple choice.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| sentiment | Sentiment | LabelQuestion | True | N/A | ['positive', 'neutral', 'negative'] |
| mixed-emotion | Mixed-emotion | MultiLabelQuestion | True | N/A | ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'] |
**✨ NEW** Additionally, we also have **suggestions**, which are linked to the existing questions, and so on, named appending "-suggestion" and "-suggestion-metadata" to those, containing the value/s of the suggestion and its metadata, respectively. So on, the possible values are the same as in the table above.
Finally, the **guidelines** are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"fields": {
"text": "i didnt feel humiliated"
},
"metadata": {},
"responses": [
{
"status": "submitted",
"user_id": "ca1b15e8-a86c-4cdf-8783-45d3ee4912f4",
"values": {
"mixed-emotion": {
"value": [
"fear",
"surprise"
]
},
"sentiment": {
"value": "positive"
}
}
}
],
"suggestions": []
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"metadata": "{}",
"mixed-emotion": [
{
"status": "submitted",
"user_id": "ca1b15e8-a86c-4cdf-8783-45d3ee4912f4",
"value": [
"fear",
"surprise"
]
}
],
"mixed-emotion-suggestion": null,
"mixed-emotion-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"sentiment": [
{
"status": "submitted",
"user_id": "ca1b15e8-a86c-4cdf-8783-45d3ee4912f4",
"value": "positive"
}
],
"sentiment-suggestion": null,
"sentiment-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"text": "i didnt feel humiliated"
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment just text fields are suppported. These are the ones that will be used to provide responses to the questions.
* **text** is of type `TextField`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **sentiment** is of type `LabelQuestion` with the following allowed values ['positive', 'neutral', 'negative'].
* **mixed-emotion** is of type `MultiLabelQuestion` with the following allowed values ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'].
* **✨ NEW** **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **sentiment-suggestion** is of type `label_selection` with the following allowed values ['positive', 'neutral', 'negative'].
* (optional) **mixed-emotion-suggestion** is of type `multi_label_selection` with the following allowed values ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'].
Additionally, we also have one more field which is optional and is the following:
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
Emotion is a dataset of English Twitter messages with six basic emotions: anger, fear, joy, love, sadness, and surprise.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_NobodyExistsOnTheInternet__PuffedLIMA13bQLORA | 2023-09-22T19:42:05.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NobodyExistsOnTheInternet/PuffedLIMA13bQLORA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NobodyExistsOnTheInternet/PuffedLIMA13bQLORA](https://huggingface.co/NobodyExistsOnTheInternet/PuffedLIMA13bQLORA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NobodyExistsOnTheInternet__PuffedLIMA13bQLORA\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T19:41:53.265233](https://huggingface.co/datasets/open-llm-leaderboard/details_NobodyExistsOnTheInternet__PuffedLIMA13bQLORA/blob/main/results_2023-09-22T19-41-53.265233.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.024853187919463088,\n\
\ \"em_stderr\": 0.0015942840667017492,\n \"f1\": 0.0820931208053691,\n\
\ \"f1_stderr\": 0.0019961777964585216,\n \"acc\": 0.4196788722651694,\n\
\ \"acc_stderr\": 0.009952538718324454\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.024853187919463088,\n \"em_stderr\": 0.0015942840667017492,\n\
\ \"f1\": 0.0820931208053691,\n \"f1_stderr\": 0.0019961777964585216\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08718726307808947,\n \
\ \"acc_stderr\": 0.0077706914167835605\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865348\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NobodyExistsOnTheInternet/PuffedLIMA13bQLORA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|arc:challenge|25_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T19_41_53.265233
path:
- '**/details_harness|drop|3_2023-09-22T19-41-53.265233.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T19-41-53.265233.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T19_41_53.265233
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-41-53.265233.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-41-53.265233.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hellaswag|10_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:14:34.642776.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T18:14:34.642776.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T18:14:34.642776.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T19_41_53.265233
path:
- '**/details_harness|winogrande|5_2023-09-22T19-41-53.265233.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T19-41-53.265233.parquet'
- config_name: results
data_files:
- split: 2023_08_29T18_14_34.642776
path:
- results_2023-08-29T18:14:34.642776.parquet
- split: 2023_09_22T19_41_53.265233
path:
- results_2023-09-22T19-41-53.265233.parquet
- split: latest
path:
- results_2023-09-22T19-41-53.265233.parquet
---
# Dataset Card for Evaluation run of NobodyExistsOnTheInternet/PuffedLIMA13bQLORA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NobodyExistsOnTheInternet/PuffedLIMA13bQLORA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NobodyExistsOnTheInternet/PuffedLIMA13bQLORA](https://huggingface.co/NobodyExistsOnTheInternet/PuffedLIMA13bQLORA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NobodyExistsOnTheInternet__PuffedLIMA13bQLORA",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T19:41:53.265233](https://huggingface.co/datasets/open-llm-leaderboard/details_NobodyExistsOnTheInternet__PuffedLIMA13bQLORA/blob/main/results_2023-09-22T19-41-53.265233.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.024853187919463088,
"em_stderr": 0.0015942840667017492,
"f1": 0.0820931208053691,
"f1_stderr": 0.0019961777964585216,
"acc": 0.4196788722651694,
"acc_stderr": 0.009952538718324454
},
"harness|drop|3": {
"em": 0.024853187919463088,
"em_stderr": 0.0015942840667017492,
"f1": 0.0820931208053691,
"f1_stderr": 0.0019961777964585216
},
"harness|gsm8k|5": {
"acc": 0.08718726307808947,
"acc_stderr": 0.0077706914167835605
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.012134386019865348
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
linhtran92/result_with_w2v2_baseline | 2023-08-29T18:19:57.000Z | [
"region:us"
] | linhtran92 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371743.625
num_examples: 1299
download_size: 164231284
dataset_size: 174371743.625
---
# Dataset Card for "result_with_w2v2_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Raf7801/proustlm | 2023-08-30T14:27:09.000Z | [
"region:us"
] | Raf7801 | null | null | null | 0 | 0 | Entry not found |
Q-bert/LLaVa-Llama-7b-tokenized | 2023-08-29T19:01:26.000Z | [
"license:mit",
"region:us"
] | Q-bert | null | null | null | 0 | 0 | ---
license: mit
---
image_processor = AutoImageProcessor.from_pretrained("google/vit-base-patch16-224")
tokenizer = AutoTokenizer.from_pretrained("huggyllama/llama-7b") |
yzhuang/autotree_automl_house_16H_sgosdt_l256_d3_sd0 | 2023-08-30T21:12:56.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 267120000
num_examples: 10000
- name: validation
num_bytes: 267120000
num_examples: 10000
download_size: 229828254
dataset_size: 534240000
---
# Dataset Card for "autotree_automl_house_16H_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
csupiisc/guanaco-llama2-1k | 2023-08-29T19:01:56.000Z | [
"region:us"
] | csupiisc | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6656
num_examples: 8
download_size: 6982
dataset_size: 6656
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_nicholasKluge__Aira-124M | 2023-08-29T19:05:51.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of nicholasKluge/Aira-124M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nicholasKluge/Aira-124M](https://huggingface.co/nicholasKluge/Aira-124M) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-124M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T19:04:35.532451](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-124M/blob/main/results_2023-08-29T19%3A04%3A35.532451.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25265346552614076,\n\
\ \"acc_stderr\": 0.03117857003137413,\n \"acc_norm\": 0.253799928797708,\n\
\ \"acc_norm_stderr\": 0.03119563902907945,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.41020465472810524,\n\
\ \"mc2_stderr\": 0.015012374839842264\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19880546075085323,\n \"acc_stderr\": 0.01166285019817554,\n\
\ \"acc_norm\": 0.24573378839590443,\n \"acc_norm_stderr\": 0.012581033453730107\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2921728739294961,\n\
\ \"acc_stderr\": 0.004538319464111971,\n \"acc_norm\": 0.312885879306911,\n\
\ \"acc_norm_stderr\": 0.004627207073171273\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.03502553170678316,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.03502553170678316\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.13,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.13,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.21935483870967742,\n\
\ \"acc_stderr\": 0.023540799358723285,\n \"acc_norm\": 0.21935483870967742,\n\
\ \"acc_norm_stderr\": 0.023540799358723285\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.19704433497536947,\n \"acc_stderr\": 0.02798672466673622,\n\
\ \"acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.02798672466673622\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.023290888053772725,\n\
\ \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.023290888053772725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.02564410863926763,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.02564410863926763\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3467889908256881,\n \"acc_stderr\": 0.020406097104093027,\n \"\
acc_norm\": 0.3467889908256881,\n \"acc_norm_stderr\": 0.020406097104093027\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27941176470588236,\n\
\ \"acc_stderr\": 0.031493281045079556,\n \"acc_norm\": 0.27941176470588236,\n\
\ \"acc_norm_stderr\": 0.031493281045079556\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.02931281415395594,\n\
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.02931281415395594\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.1210762331838565,\n\
\ \"acc_stderr\": 0.021894174113185737,\n \"acc_norm\": 0.1210762331838565,\n\
\ \"acc_norm_stderr\": 0.021894174113185737\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n\
\ \"acc_stderr\": 0.027778835904935427,\n \"acc_norm\": 0.23504273504273504,\n\
\ \"acc_norm_stderr\": 0.027778835904935427\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24521072796934865,\n\
\ \"acc_stderr\": 0.01538435228454394,\n \"acc_norm\": 0.24521072796934865,\n\
\ \"acc_norm_stderr\": 0.01538435228454394\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18006430868167203,\n\
\ \"acc_stderr\": 0.021823422857744953,\n \"acc_norm\": 0.18006430868167203,\n\
\ \"acc_norm_stderr\": 0.021823422857744953\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.02419180860071301,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.02419180860071301\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25554106910039115,\n\
\ \"acc_stderr\": 0.011139857833598506,\n \"acc_norm\": 0.25554106910039115,\n\
\ \"acc_norm_stderr\": 0.011139857833598506\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.036942843353378,\n\
\ \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.036942843353378\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.031362502409358936,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.031362502409358936\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014652,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014652\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.22289156626506024,\n \"acc_stderr\": 0.03240004825594689,\n\
\ \"acc_norm\": 0.22289156626506024,\n \"acc_norm_stderr\": 0.03240004825594689\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.01507721920066259,\n\
\ \"mc2\": 0.41020465472810524,\n \"mc2_stderr\": 0.015012374839842264\n\
\ }\n}\n```"
repo_url: https://huggingface.co/nicholasKluge/Aira-124M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:04:35.532451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:04:35.532451.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:04:35.532451.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:04:35.532451.parquet'
- config_name: results
data_files:
- split: 2023_08_29T19_04_35.532451
path:
- results_2023-08-29T19:04:35.532451.parquet
- split: latest
path:
- results_2023-08-29T19:04:35.532451.parquet
---
# Dataset Card for Evaluation run of nicholasKluge/Aira-124M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-124M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-124M](https://huggingface.co/nicholasKluge/Aira-124M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-124M",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T19:04:35.532451](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-124M/blob/main/results_2023-08-29T19%3A04%3A35.532451.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25265346552614076,
"acc_stderr": 0.03117857003137413,
"acc_norm": 0.253799928797708,
"acc_norm_stderr": 0.03119563902907945,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.41020465472810524,
"mc2_stderr": 0.015012374839842264
},
"harness|arc:challenge|25": {
"acc": 0.19880546075085323,
"acc_stderr": 0.01166285019817554,
"acc_norm": 0.24573378839590443,
"acc_norm_stderr": 0.012581033453730107
},
"harness|hellaswag|10": {
"acc": 0.2921728739294961,
"acc_stderr": 0.004538319464111971,
"acc_norm": 0.312885879306911,
"acc_norm_stderr": 0.004627207073171273
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.03502553170678316,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.03502553170678316
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.13,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.13,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.21935483870967742,
"acc_stderr": 0.023540799358723285,
"acc_norm": 0.21935483870967742,
"acc_norm_stderr": 0.023540799358723285
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.19704433497536947,
"acc_stderr": 0.02798672466673622,
"acc_norm": 0.19704433497536947,
"acc_norm_stderr": 0.02798672466673622
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30256410256410254,
"acc_stderr": 0.023290888053772725,
"acc_norm": 0.30256410256410254,
"acc_norm_stderr": 0.023290888053772725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.02564410863926763,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.02564410863926763
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3467889908256881,
"acc_stderr": 0.020406097104093027,
"acc_norm": 0.3467889908256881,
"acc_norm_stderr": 0.020406097104093027
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.02931281415395594,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.02931281415395594
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.1210762331838565,
"acc_stderr": 0.021894174113185737,
"acc_norm": 0.1210762331838565,
"acc_norm_stderr": 0.021894174113185737
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.027778835904935427,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.027778835904935427
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24521072796934865,
"acc_stderr": 0.01538435228454394,
"acc_norm": 0.24521072796934865,
"acc_norm_stderr": 0.01538435228454394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18006430868167203,
"acc_stderr": 0.021823422857744953,
"acc_norm": 0.18006430868167203,
"acc_norm_stderr": 0.021823422857744953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.02419180860071301,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.02419180860071301
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25554106910039115,
"acc_stderr": 0.011139857833598506,
"acc_norm": 0.25554106910039115,
"acc_norm_stderr": 0.011139857833598506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.036942843353378,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.036942843353378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.03240004825594689,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.03240004825594689
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.41020465472810524,
"mc2_stderr": 0.015012374839842264
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
purvesh/MCD_ABSA | 2023-08-29T19:15:21.000Z | [
"region:us"
] | purvesh | null | null | null | 0 | 0 | |
isashap/airesumefinal | 2023-08-29T19:58:31.000Z | [
"region:us"
] | isashap | null | null | null | 0 | 0 | |
linhtran92/result_with_w2v2_baseline_snfintuned_20 | 2023-08-29T19:17:09.000Z | [
"region:us"
] | linhtran92 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174374874.625
num_examples: 1299
download_size: 164232283
dataset_size: 174374874.625
---
# Dataset Card for "result_with_w2v2_baseline_snfintuned_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dshut002/Mermaid_LLAMA | 2023-08-30T23:05:20.000Z | [
"region:us"
] | dshut002 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 503
num_examples: 1
download_size: 4922
dataset_size: 503
---
# Dataset Card for "Mermaid_LLAMA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gej/fhjg | 2023-08-29T19:28:12.000Z | [
"region:us"
] | gej | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TheBloke__Project-Baize-v2-7B-GPTQ | 2023-08-29T19:39:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Project-Baize-v2-7B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Project-Baize-v2-7B-GPTQ](https://huggingface.co/TheBloke/Project-Baize-v2-7B-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Project-Baize-v2-7B-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T19:38:18.380876](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Project-Baize-v2-7B-GPTQ/blob/main/results_2023-08-29T19%3A38%3A18.380876.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3595551770399452,\n\
\ \"acc_stderr\": 0.034528095881188374,\n \"acc_norm\": 0.3628658513712869,\n\
\ \"acc_norm_stderr\": 0.03451971828647514,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.3991984976250448,\n\
\ \"mc2_stderr\": 0.015243111830696071\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.44112627986348124,\n \"acc_stderr\": 0.014509747749064663,\n\
\ \"acc_norm\": 0.4598976109215017,\n \"acc_norm_stderr\": 0.014564318856924848\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5578570005974905,\n\
\ \"acc_stderr\": 0.004956262919324401,\n \"acc_norm\": 0.7344154550886277,\n\
\ \"acc_norm_stderr\": 0.004407413723383404\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.030151134457776296,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.030151134457776296\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\
\ \"acc_stderr\": 0.034564257450869995,\n \"acc_norm\": 0.28901734104046245,\n\
\ \"acc_norm_stderr\": 0.034564257450869995\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.029771642712491227,\n\
\ \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.029771642712491227\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.36551724137931035,\n \"acc_stderr\": 0.04013124195424385,\n\
\ \"acc_norm\": 0.36551724137931035,\n \"acc_norm_stderr\": 0.04013124195424385\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2328042328042328,\n \"acc_stderr\": 0.02176596167215453,\n \"\
acc_norm\": 0.2328042328042328,\n \"acc_norm_stderr\": 0.02176596167215453\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.041349130183033156,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.041349130183033156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36451612903225805,\n\
\ \"acc_stderr\": 0.02737987122994325,\n \"acc_norm\": 0.36451612903225805,\n\
\ \"acc_norm_stderr\": 0.02737987122994325\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4484848484848485,\n \"acc_stderr\": 0.038835659779569286,\n\
\ \"acc_norm\": 0.4484848484848485,\n \"acc_norm_stderr\": 0.038835659779569286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4292929292929293,\n \"acc_stderr\": 0.03526552724601199,\n \"\
acc_norm\": 0.4292929292929293,\n \"acc_norm_stderr\": 0.03526552724601199\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.44559585492227977,\n \"acc_stderr\": 0.035870149860756595,\n\
\ \"acc_norm\": 0.44559585492227977,\n \"acc_norm_stderr\": 0.035870149860756595\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.024035489676335065,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.024035489676335065\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.03068473711513536,\n\
\ \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4036697247706422,\n \"acc_stderr\": 0.02103570485657497,\n \"\
acc_norm\": 0.4036697247706422,\n \"acc_norm_stderr\": 0.02103570485657497\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298825,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298825\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4362745098039216,\n \"acc_stderr\": 0.03480693138457039,\n \"\
acc_norm\": 0.4362745098039216,\n \"acc_norm_stderr\": 0.03480693138457039\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4936708860759494,\n \"acc_stderr\": 0.03254462010767859,\n \
\ \"acc_norm\": 0.4936708860759494,\n \"acc_norm_stderr\": 0.03254462010767859\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\
\ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.3721973094170404,\n\
\ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.45454545454545453,\n \"acc_stderr\": 0.045454545454545456,\n \"\
acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.045454545454545456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3374233128834356,\n \"acc_stderr\": 0.037149084099355745,\n\
\ \"acc_norm\": 0.3374233128834356,\n \"acc_norm_stderr\": 0.037149084099355745\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4077669902912621,\n \"acc_stderr\": 0.04865777570410769,\n\
\ \"acc_norm\": 0.4077669902912621,\n \"acc_norm_stderr\": 0.04865777570410769\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.03255326307272485,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.03255326307272485\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.46998722860791825,\n \"acc_stderr\": 0.0178477230866491,\n\
\ \"acc_norm\": 0.46998722860791825,\n \"acc_norm_stderr\": 0.0178477230866491\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3670520231213873,\n\
\ \"acc_stderr\": 0.025950054337654082,\n \"acc_norm\": 0.3670520231213873,\n\
\ \"acc_norm_stderr\": 0.025950054337654082\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.25251396648044694,\n \"acc_stderr\": 0.014530330201468645,\n\
\ \"acc_norm\": 0.25251396648044694,\n \"acc_norm_stderr\": 0.014530330201468645\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.33986928104575165,\n\
\ \"acc_stderr\": 0.027121956071388845,\n \"acc_norm\": 0.33986928104575165,\n\
\ \"acc_norm_stderr\": 0.027121956071388845\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.3762057877813505,\n \"acc_stderr\": 0.02751392568354943,\n\
\ \"acc_norm\": 0.3762057877813505,\n \"acc_norm_stderr\": 0.02751392568354943\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.027002521034516478,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.027002521034516478\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859062,\n\
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.303129074315515,\n\
\ \"acc_stderr\": 0.01173866995125429,\n \"acc_norm\": 0.303129074315515,\n\
\ \"acc_norm_stderr\": 0.01173866995125429\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.03023375855159645,\n\
\ \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.03023375855159645\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.30718954248366015,\n \"acc_stderr\": 0.01866335967146367,\n \
\ \"acc_norm\": 0.30718954248366015,\n \"acc_norm_stderr\": 0.01866335967146367\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.029923100563683906,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.029923100563683906\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.40298507462686567,\n\
\ \"acc_stderr\": 0.03468343295111126,\n \"acc_norm\": 0.40298507462686567,\n\
\ \"acc_norm_stderr\": 0.03468343295111126\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n\
\ \"acc_stderr\": 0.0368078369072758,\n \"acc_norm\": 0.3373493975903614,\n\
\ \"acc_norm_stderr\": 0.0368078369072758\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.038342347441649924,\n\
\ \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.038342347441649924\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156494,\n \"mc2\": 0.3991984976250448,\n\
\ \"mc2_stderr\": 0.015243111830696071\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Project-Baize-v2-7B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:18.380876.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:18.380876.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:38:18.380876.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:38:18.380876.parquet'
- config_name: results
data_files:
- split: 2023_08_29T19_38_18.380876
path:
- results_2023-08-29T19:38:18.380876.parquet
- split: latest
path:
- results_2023-08-29T19:38:18.380876.parquet
---
# Dataset Card for Evaluation run of TheBloke/Project-Baize-v2-7B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Project-Baize-v2-7B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Project-Baize-v2-7B-GPTQ](https://huggingface.co/TheBloke/Project-Baize-v2-7B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Project-Baize-v2-7B-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T19:38:18.380876](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Project-Baize-v2-7B-GPTQ/blob/main/results_2023-08-29T19%3A38%3A18.380876.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3595551770399452,
"acc_stderr": 0.034528095881188374,
"acc_norm": 0.3628658513712869,
"acc_norm_stderr": 0.03451971828647514,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.3991984976250448,
"mc2_stderr": 0.015243111830696071
},
"harness|arc:challenge|25": {
"acc": 0.44112627986348124,
"acc_stderr": 0.014509747749064663,
"acc_norm": 0.4598976109215017,
"acc_norm_stderr": 0.014564318856924848
},
"harness|hellaswag|10": {
"acc": 0.5578570005974905,
"acc_stderr": 0.004956262919324401,
"acc_norm": 0.7344154550886277,
"acc_norm_stderr": 0.004407413723383404
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4,
"acc_stderr": 0.030151134457776296,
"acc_norm": 0.4,
"acc_norm_stderr": 0.030151134457776296
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.034564257450869995,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.034564257450869995
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2936170212765957,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.2936170212765957,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.36551724137931035,
"acc_stderr": 0.04013124195424385,
"acc_norm": 0.36551724137931035,
"acc_norm_stderr": 0.04013124195424385
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2328042328042328,
"acc_stderr": 0.02176596167215453,
"acc_norm": 0.2328042328042328,
"acc_norm_stderr": 0.02176596167215453
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.041349130183033156,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.041349130183033156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36451612903225805,
"acc_stderr": 0.02737987122994325,
"acc_norm": 0.36451612903225805,
"acc_norm_stderr": 0.02737987122994325
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4484848484848485,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.4484848484848485,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4292929292929293,
"acc_stderr": 0.03526552724601199,
"acc_norm": 0.4292929292929293,
"acc_norm_stderr": 0.03526552724601199
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.44559585492227977,
"acc_stderr": 0.035870149860756595,
"acc_norm": 0.44559585492227977,
"acc_norm_stderr": 0.035870149860756595
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.024035489676335065,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.024035489676335065
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4036697247706422,
"acc_stderr": 0.02103570485657497,
"acc_norm": 0.4036697247706422,
"acc_norm_stderr": 0.02103570485657497
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298825,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298825
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4362745098039216,
"acc_stderr": 0.03480693138457039,
"acc_norm": 0.4362745098039216,
"acc_norm_stderr": 0.03480693138457039
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4936708860759494,
"acc_stderr": 0.03254462010767859,
"acc_norm": 0.4936708860759494,
"acc_norm_stderr": 0.03254462010767859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.045454545454545456,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.045454545454545456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3374233128834356,
"acc_stderr": 0.037149084099355745,
"acc_norm": 0.3374233128834356,
"acc_norm_stderr": 0.037149084099355745
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755805,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755805
},
"harness|hendrycksTest-management|5": {
"acc": 0.4077669902912621,
"acc_stderr": 0.04865777570410769,
"acc_norm": 0.4077669902912621,
"acc_norm_stderr": 0.04865777570410769
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03255326307272485,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03255326307272485
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.46998722860791825,
"acc_stderr": 0.0178477230866491,
"acc_norm": 0.46998722860791825,
"acc_norm_stderr": 0.0178477230866491
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3670520231213873,
"acc_stderr": 0.025950054337654082,
"acc_norm": 0.3670520231213873,
"acc_norm_stderr": 0.025950054337654082
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468645,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.33986928104575165,
"acc_stderr": 0.027121956071388845,
"acc_norm": 0.33986928104575165,
"acc_norm_stderr": 0.027121956071388845
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3762057877813505,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.3762057877813505,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.027002521034516478,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.027002521034516478
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.02646903681859062,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.02646903681859062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.303129074315515,
"acc_stderr": 0.01173866995125429,
"acc_norm": 0.303129074315515,
"acc_norm_stderr": 0.01173866995125429
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.03023375855159645,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.03023375855159645
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.30718954248366015,
"acc_stderr": 0.01866335967146367,
"acc_norm": 0.30718954248366015,
"acc_norm_stderr": 0.01866335967146367
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.40298507462686567,
"acc_stderr": 0.03468343295111126,
"acc_norm": 0.40298507462686567,
"acc_norm_stderr": 0.03468343295111126
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.0368078369072758,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.0368078369072758
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.038342347441649924,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.038342347441649924
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156494,
"mc2": 0.3991984976250448,
"mc2_stderr": 0.015243111830696071
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TFLai__Platypus2-13B-QLoRA-0.80-epoch | 2023-08-29T19:40:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/Platypus2-13B-QLoRA-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/Platypus2-13B-QLoRA-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__Platypus2-13B-QLoRA-0.80-epoch\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T19:38:59.020077](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-08-29T19%3A38%3A59.020077.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5569397615500319,\n \"\
acc_stderr\": 0.03437378885442287,\n \"acc_norm\": 0.5610397654591069,\n\
\ \"acc_norm_stderr\": 0.03435467586523047,\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326923,\n \"mc2\": 0.39698245763845513,\n\
\ \"mc2_stderr\": 0.014107477141963906\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5443686006825939,\n \"acc_stderr\": 0.014553749939306868,\n\
\ \"acc_norm\": 0.5776450511945392,\n \"acc_norm_stderr\": 0.01443413871337998\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.607647878908584,\n\
\ \"acc_stderr\": 0.004872765504069852,\n \"acc_norm\": 0.8162716590320653,\n\
\ \"acc_norm_stderr\": 0.0038647103676450554\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490437,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490437\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.040329990539607195,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.040329990539607195\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.038118909889404126,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.038118909889404126\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467381,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467381\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.024278568024307706,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.024278568024307706\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.026860206444724345,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.026860206444724345\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438804,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438804\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.031544498882702846,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.031544498882702846\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624527,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624527\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412195,\n\
\ \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412195\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7688073394495413,\n \"acc_stderr\": 0.01807575024163315,\n \"\
acc_norm\": 0.7688073394495413,\n \"acc_norm_stderr\": 0.01807575024163315\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.034028015813589656,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.034028015813589656\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7156862745098039,\n \"acc_stderr\": 0.03166009679399811,\n \"\
acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.03166009679399811\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976235,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976235\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335452,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335452\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n\
\ \"acc_stderr\": 0.015075523238101083,\n \"acc_norm\": 0.768837803320562,\n\
\ \"acc_norm_stderr\": 0.015075523238101083\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.026152198619726803,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.026152198619726803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n\
\ \"acc_stderr\": 0.015166544550490317,\n \"acc_norm\": 0.28938547486033517,\n\
\ \"acc_norm_stderr\": 0.015166544550490317\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581996,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581996\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.026869490744815247,\n\
\ \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.026869490744815247\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.0294621892333706,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.0294621892333706\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4302477183833116,\n\
\ \"acc_stderr\": 0.012645361435115228,\n \"acc_norm\": 0.4302477183833116,\n\
\ \"acc_norm_stderr\": 0.012645361435115228\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.029896163033125468,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.029896163033125468\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5964052287581699,\n \"acc_stderr\": 0.019848280168401147,\n \
\ \"acc_norm\": 0.5964052287581699,\n \"acc_norm_stderr\": 0.019848280168401147\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674268,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674268\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326923,\n \"mc2\": 0.39698245763845513,\n\
\ \"mc2_stderr\": 0.014107477141963906\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/Platypus2-13B-QLoRA-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:59.020077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:38:59.020077.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:38:59.020077.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:38:59.020077.parquet'
- config_name: results
data_files:
- split: 2023_08_29T19_38_59.020077
path:
- results_2023-08-29T19:38:59.020077.parquet
- split: latest
path:
- results_2023-08-29T19:38:59.020077.parquet
---
# Dataset Card for Evaluation run of TFLai/Platypus2-13B-QLoRA-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/Platypus2-13B-QLoRA-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/Platypus2-13B-QLoRA-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__Platypus2-13B-QLoRA-0.80-epoch",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T19:38:59.020077](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-08-29T19%3A38%3A59.020077.json):
```python
{
"all": {
"acc": 0.5569397615500319,
"acc_stderr": 0.03437378885442287,
"acc_norm": 0.5610397654591069,
"acc_norm_stderr": 0.03435467586523047,
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326923,
"mc2": 0.39698245763845513,
"mc2_stderr": 0.014107477141963906
},
"harness|arc:challenge|25": {
"acc": 0.5443686006825939,
"acc_stderr": 0.014553749939306868,
"acc_norm": 0.5776450511945392,
"acc_norm_stderr": 0.01443413871337998
},
"harness|hellaswag|10": {
"acc": 0.607647878908584,
"acc_stderr": 0.004872765504069852,
"acc_norm": 0.8162716590320653,
"acc_norm_stderr": 0.0038647103676450554
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490437,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490437
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.040329990539607195,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.040329990539607195
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.038118909889404126,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.038118909889404126
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.024278568024307706,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.024278568024307706
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724345,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724345
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.031544498882702846,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.031544498882702846
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624527,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624527
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5025641025641026,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.5025641025641026,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7688073394495413,
"acc_stderr": 0.01807575024163315,
"acc_norm": 0.7688073394495413,
"acc_norm_stderr": 0.01807575024163315
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.034028015813589656,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.034028015813589656
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.03166009679399811,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.03166009679399811
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976235,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976235
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335452,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335452
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101083,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101083
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.026152198619726803,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.026152198619726803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28938547486033517,
"acc_stderr": 0.015166544550490317,
"acc_norm": 0.28938547486033517,
"acc_norm_stderr": 0.015166544550490317
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581996,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581996
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.026869490744815247,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.026869490744815247
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.0294621892333706,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.0294621892333706
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4302477183833116,
"acc_stderr": 0.012645361435115228,
"acc_norm": 0.4302477183833116,
"acc_norm_stderr": 0.012645361435115228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.029896163033125468,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.029896163033125468
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5964052287581699,
"acc_stderr": 0.019848280168401147,
"acc_norm": 0.5964052287581699,
"acc_norm_stderr": 0.019848280168401147
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674268,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674268
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326923,
"mc2": 0.39698245763845513,
"mc2_stderr": 0.014107477141963906
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gsl22/ds-gaba | 2023-08-29T20:04:41.000Z | [
"region:us"
] | gsl22 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_eli5_1024_r_64_alpha_16 | 2023-08-29T19:44:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16](https://huggingface.co/dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_eli5_1024_r_64_alpha_16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T19:42:53.068722](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_eli5_1024_r_64_alpha_16/blob/main/results_2023-08-29T19%3A42%3A53.068722.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4697707641857931,\n\
\ \"acc_stderr\": 0.03530390982737618,\n \"acc_norm\": 0.47385696017732193,\n\
\ \"acc_norm_stderr\": 0.03528934315282101,\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.4513950473642844,\n\
\ \"mc2_stderr\": 0.014104980028751905\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48976109215017066,\n \"acc_stderr\": 0.014608326906285019,\n\
\ \"acc_norm\": 0.5332764505119454,\n \"acc_norm_stderr\": 0.014578995859605804\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5907189802828122,\n\
\ \"acc_stderr\": 0.004906962980328293,\n \"acc_norm\": 0.7882891854212308,\n\
\ \"acc_norm_stderr\": 0.004076860228251773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.03988903703336284,\n\
\ \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.03988903703336284\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.45660377358490567,\n \"acc_stderr\": 0.030656748696739438,\n\
\ \"acc_norm\": 0.45660377358490567,\n \"acc_norm_stderr\": 0.030656748696739438\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224469,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224469\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.02300008685906864,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.02300008685906864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4935483870967742,\n\
\ \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.4935483870967742,\n\
\ \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.03295797566311271,\n\
\ \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.03295797566311271\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.038435669935887165,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.038435669935887165\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5050505050505051,\n \"acc_stderr\": 0.035621707606254015,\n \"\
acc_norm\": 0.5050505050505051,\n \"acc_norm_stderr\": 0.035621707606254015\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.03308818594415751,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.03308818594415751\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4564102564102564,\n \"acc_stderr\": 0.025254485424799605,\n\
\ \"acc_norm\": 0.4564102564102564,\n \"acc_norm_stderr\": 0.025254485424799605\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.02763490726417854,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.02763490726417854\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6385321100917432,\n \"acc_stderr\": 0.020598082009937374,\n \"\
acc_norm\": 0.6385321100917432,\n \"acc_norm_stderr\": 0.020598082009937374\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510934,\n \"\
acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510934\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5588235294117647,\n \"acc_stderr\": 0.034849415144292316,\n \"\
acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.034849415144292316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6075949367088608,\n \"acc_stderr\": 0.03178471874564729,\n \
\ \"acc_norm\": 0.6075949367088608,\n \"acc_norm_stderr\": 0.03178471874564729\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n\
\ \"acc_stderr\": 0.03314190222110657,\n \"acc_norm\": 0.57847533632287,\n\
\ \"acc_norm_stderr\": 0.03314190222110657\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437056,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437056\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.048979577377811674,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.048979577377811674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.030236389942173078,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.030236389942173078\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6372924648786717,\n\
\ \"acc_stderr\": 0.017192708674602306,\n \"acc_norm\": 0.6372924648786717,\n\
\ \"acc_norm_stderr\": 0.017192708674602306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5028901734104047,\n \"acc_stderr\": 0.026918645383239004,\n\
\ \"acc_norm\": 0.5028901734104047,\n \"acc_norm_stderr\": 0.026918645383239004\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.028629305194003543,\n\
\ \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.028629305194003543\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.47530864197530864,\n \"acc_stderr\": 0.027786800931427443,\n\
\ \"acc_norm\": 0.47530864197530864,\n \"acc_norm_stderr\": 0.027786800931427443\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611324,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611324\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37027379400260757,\n\
\ \"acc_stderr\": 0.01233293078125673,\n \"acc_norm\": 0.37027379400260757,\n\
\ \"acc_norm_stderr\": 0.01233293078125673\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213535,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213535\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4444444444444444,\n \"acc_stderr\": 0.020102583895887188,\n \
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.020102583895887188\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4897959183673469,\n \"acc_stderr\": 0.03200255347893782,\n\
\ \"acc_norm\": 0.4897959183673469,\n \"acc_norm_stderr\": 0.03200255347893782\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n\
\ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.4513950473642844,\n\
\ \"mc2_stderr\": 0.014104980028751905\n }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:42:53.068722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:42:53.068722.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:42:53.068722.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:42:53.068722.parquet'
- config_name: results
data_files:
- split: 2023_08_29T19_42_53.068722
path:
- results_2023-08-29T19:42:53.068722.parquet
- split: latest
path:
- results_2023-08-29T19:42:53.068722.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16](https://huggingface.co/dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_eli5_1024_r_64_alpha_16",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T19:42:53.068722](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_eli5_1024_r_64_alpha_16/blob/main/results_2023-08-29T19%3A42%3A53.068722.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4697707641857931,
"acc_stderr": 0.03530390982737618,
"acc_norm": 0.47385696017732193,
"acc_norm_stderr": 0.03528934315282101,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.4513950473642844,
"mc2_stderr": 0.014104980028751905
},
"harness|arc:challenge|25": {
"acc": 0.48976109215017066,
"acc_stderr": 0.014608326906285019,
"acc_norm": 0.5332764505119454,
"acc_norm_stderr": 0.014578995859605804
},
"harness|hellaswag|10": {
"acc": 0.5907189802828122,
"acc_stderr": 0.004906962980328293,
"acc_norm": 0.7882891854212308,
"acc_norm_stderr": 0.004076860228251773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40131578947368424,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.40131578947368424,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.45660377358490567,
"acc_stderr": 0.030656748696739438,
"acc_norm": 0.45660377358490567,
"acc_norm_stderr": 0.030656748696739438
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.4,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.02300008685906864,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.02300008685906864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4935483870967742,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.4935483870967742,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3251231527093596,
"acc_stderr": 0.03295797566311271,
"acc_norm": 0.3251231527093596,
"acc_norm_stderr": 0.03295797566311271
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.038435669935887165,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.038435669935887165
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5050505050505051,
"acc_stderr": 0.035621707606254015,
"acc_norm": 0.5050505050505051,
"acc_norm_stderr": 0.035621707606254015
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.03308818594415751,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.03308818594415751
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4564102564102564,
"acc_stderr": 0.025254485424799605,
"acc_norm": 0.4564102564102564,
"acc_norm_stderr": 0.025254485424799605
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.02763490726417854,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.02763490726417854
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6385321100917432,
"acc_stderr": 0.020598082009937374,
"acc_norm": 0.6385321100917432,
"acc_norm_stderr": 0.020598082009937374
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510934,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510934
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.034849415144292316,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.034849415144292316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6075949367088608,
"acc_stderr": 0.03178471874564729,
"acc_norm": 0.6075949367088608,
"acc_norm_stderr": 0.03178471874564729
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.03314190222110657,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.03314190222110657
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.048979577377811674,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.048979577377811674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.030236389942173078,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.030236389942173078
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6372924648786717,
"acc_stderr": 0.017192708674602306,
"acc_norm": 0.6372924648786717,
"acc_norm_stderr": 0.017192708674602306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.026918645383239004,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.026918645383239004
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.47530864197530864,
"acc_stderr": 0.027786800931427443,
"acc_norm": 0.47530864197530864,
"acc_norm_stderr": 0.027786800931427443
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611324,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611324
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37027379400260757,
"acc_stderr": 0.01233293078125673,
"acc_norm": 0.37027379400260757,
"acc_norm_stderr": 0.01233293078125673
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213535,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213535
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.020102583895887188,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.020102583895887188
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4897959183673469,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.4897959183673469,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.4513950473642844,
"mc2_stderr": 0.014104980028751905
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-wiki30k_r_64_alpha_16 | 2023-08-29T19:47:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dhmeltzer/Llama-2-7b-hf-wiki30k_r_64_alpha_16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/Llama-2-7b-hf-wiki30k_r_64_alpha_16](https://huggingface.co/dhmeltzer/Llama-2-7b-hf-wiki30k_r_64_alpha_16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-wiki30k_r_64_alpha_16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T19:45:42.675668](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-wiki30k_r_64_alpha_16/blob/main/results_2023-08-29T19%3A45%3A42.675668.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4674646723979816,\n\
\ \"acc_stderr\": 0.03520803561024559,\n \"acc_norm\": 0.47144963624975206,\n\
\ \"acc_norm_stderr\": 0.03519372000845246,\n \"mc1\": 0.24479804161566707,\n\
\ \"mc1_stderr\": 0.015051869486715014,\n \"mc2\": 0.38637509679052146,\n\
\ \"mc2_stderr\": 0.013509815622124081\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4948805460750853,\n \"acc_stderr\": 0.01461062489030916,\n\
\ \"acc_norm\": 0.5324232081911263,\n \"acc_norm_stderr\": 0.014580637569995421\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5877315275841466,\n\
\ \"acc_stderr\": 0.004912370023913015,\n \"acc_norm\": 0.7853017327225652,\n\
\ \"acc_norm_stderr\": 0.004097736838432052\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4528301886792453,\n \"acc_stderr\": 0.03063562795796182,\n\
\ \"acc_norm\": 0.4528301886792453,\n \"acc_norm_stderr\": 0.03063562795796182\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.4513888888888889,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708628,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708628\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4967741935483871,\n\
\ \"acc_stderr\": 0.02844341422643833,\n \"acc_norm\": 0.4967741935483871,\n\
\ \"acc_norm_stderr\": 0.02844341422643833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n\
\ \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.03561625488673745,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.03561625488673745\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.0325771407770966,\n\
\ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.0325771407770966\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43846153846153846,\n \"acc_stderr\": 0.025158266016868564,\n\
\ \"acc_norm\": 0.43846153846153846,\n \"acc_norm_stderr\": 0.025158266016868564\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.634862385321101,\n\
\ \"acc_stderr\": 0.02064280145438401,\n \"acc_norm\": 0.634862385321101,\n\
\ \"acc_norm_stderr\": 0.02064280145438401\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.030058202704309846,\n\
\ \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.030058202704309846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5637254901960784,\n \"acc_stderr\": 0.034806931384570396,\n \"\
acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.034806931384570396\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6118143459915611,\n \"acc_stderr\": 0.031722950043323296,\n \
\ \"acc_norm\": 0.6118143459915611,\n \"acc_norm_stderr\": 0.031722950043323296\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.04931801994220416,\n\
\ \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.04931801994220416\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6837606837606838,\n\
\ \"acc_stderr\": 0.03046365674734027,\n \"acc_norm\": 0.6837606837606838,\n\
\ \"acc_norm_stderr\": 0.03046365674734027\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6398467432950191,\n\
\ \"acc_stderr\": 0.017166362471369306,\n \"acc_norm\": 0.6398467432950191,\n\
\ \"acc_norm_stderr\": 0.017166362471369306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n \
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.028607893699576066,\n\
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.028607893699576066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\
\ \"acc_stderr\": 0.028043399858210628,\n \"acc_norm\": 0.5787781350482315,\n\
\ \"acc_norm_stderr\": 0.028043399858210628\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n\
\ \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37222946544980445,\n\
\ \"acc_stderr\": 0.01234624129720437,\n \"acc_norm\": 0.37222946544980445,\n\
\ \"acc_norm_stderr\": 0.01234624129720437\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.45751633986928103,\n \"acc_stderr\": 0.02015468571259089,\n \
\ \"acc_norm\": 0.45751633986928103,\n \"acc_norm_stderr\": 0.02015468571259089\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.49387755102040815,\n \"acc_stderr\": 0.03200682020163908,\n\
\ \"acc_norm\": 0.49387755102040815,\n \"acc_norm_stderr\": 0.03200682020163908\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268813,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268813\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24479804161566707,\n\
\ \"mc1_stderr\": 0.015051869486715014,\n \"mc2\": 0.38637509679052146,\n\
\ \"mc2_stderr\": 0.013509815622124081\n }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/Llama-2-7b-hf-wiki30k_r_64_alpha_16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:45:42.675668.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:45:42.675668.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:45:42.675668.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T19:45:42.675668.parquet'
- config_name: results
data_files:
- split: 2023_08_29T19_45_42.675668
path:
- results_2023-08-29T19:45:42.675668.parquet
- split: latest
path:
- results_2023-08-29T19:45:42.675668.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/Llama-2-7b-hf-wiki30k_r_64_alpha_16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/Llama-2-7b-hf-wiki30k_r_64_alpha_16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/Llama-2-7b-hf-wiki30k_r_64_alpha_16](https://huggingface.co/dhmeltzer/Llama-2-7b-hf-wiki30k_r_64_alpha_16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-wiki30k_r_64_alpha_16",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T19:45:42.675668](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-7b-hf-wiki30k_r_64_alpha_16/blob/main/results_2023-08-29T19%3A45%3A42.675668.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4674646723979816,
"acc_stderr": 0.03520803561024559,
"acc_norm": 0.47144963624975206,
"acc_norm_stderr": 0.03519372000845246,
"mc1": 0.24479804161566707,
"mc1_stderr": 0.015051869486715014,
"mc2": 0.38637509679052146,
"mc2_stderr": 0.013509815622124081
},
"harness|arc:challenge|25": {
"acc": 0.4948805460750853,
"acc_stderr": 0.01461062489030916,
"acc_norm": 0.5324232081911263,
"acc_norm_stderr": 0.014580637569995421
},
"harness|hellaswag|10": {
"acc": 0.5877315275841466,
"acc_stderr": 0.004912370023913015,
"acc_norm": 0.7853017327225652,
"acc_norm_stderr": 0.004097736838432052
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4528301886792453,
"acc_stderr": 0.03063562795796182,
"acc_norm": 0.4528301886792453,
"acc_norm_stderr": 0.03063562795796182
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4513888888888889,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.4513888888888889,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708628,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708628
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4967741935483871,
"acc_stderr": 0.02844341422643833,
"acc_norm": 0.4967741935483871,
"acc_norm_stderr": 0.02844341422643833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.03561625488673745,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.03561625488673745
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.0325771407770966,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.0325771407770966
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43846153846153846,
"acc_stderr": 0.025158266016868564,
"acc_norm": 0.43846153846153846,
"acc_norm_stderr": 0.025158266016868564
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.634862385321101,
"acc_stderr": 0.02064280145438401,
"acc_norm": 0.634862385321101,
"acc_norm_stderr": 0.02064280145438401
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.034806931384570396,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.034806931384570396
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6118143459915611,
"acc_stderr": 0.031722950043323296,
"acc_norm": 0.6118143459915611,
"acc_norm_stderr": 0.031722950043323296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.04931801994220416,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.04931801994220416
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6837606837606838,
"acc_stderr": 0.03046365674734027,
"acc_norm": 0.6837606837606838,
"acc_norm_stderr": 0.03046365674734027
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6398467432950191,
"acc_stderr": 0.017166362471369306,
"acc_norm": 0.6398467432950191,
"acc_norm_stderr": 0.017166362471369306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.028607893699576066,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.028607893699576066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.028043399858210628,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.028043399858210628
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281278,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281278
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37222946544980445,
"acc_stderr": 0.01234624129720437,
"acc_norm": 0.37222946544980445,
"acc_norm_stderr": 0.01234624129720437
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45751633986928103,
"acc_stderr": 0.02015468571259089,
"acc_norm": 0.45751633986928103,
"acc_norm_stderr": 0.02015468571259089
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49387755102040815,
"acc_stderr": 0.03200682020163908,
"acc_norm": 0.49387755102040815,
"acc_norm_stderr": 0.03200682020163908
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268813,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268813
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24479804161566707,
"mc1_stderr": 0.015051869486715014,
"mc2": 0.38637509679052146,
"mc2_stderr": 0.013509815622124081
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zakcroft/test_lamini_docs | 2023-08-29T20:01:24.000Z | [
"region:us"
] | zakcroft | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 4054.5
num_examples: 5
- name: test
num_bytes: 4054.5
num_examples: 5
download_size: 12911
dataset_size: 8109.0
---
# Dataset Card for "test_lamini_docs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AnimaLab/bias-test-gpt-biases | 2023-10-10T09:41:05.000Z | [
"language:en",
"license:apache-2.0",
"arxiv:1906.07337",
"region:us"
] | AnimaLab | null | null | null | 0 | 0 | ---
license: apache-2.0
language:
- en
pretty_name: BiasTestGPT-sentences
---
# Dataset Card for "BiasTestGPT: Bias Specifications"
Dataset of sentences for bias testing in open-sourced Pretrained Language Models generated using ChatGPT and other generative Language Models.
This dataset is used and actively populated by the [BiasTestGPT HuggingFace Tool](https://huggingface.co/spaces/AnimaLab/bias-test-gpt-pairs).
- [BiasTestGPT HuggingFace Tool](https://huggingface.co/spaces/AnimaLab/bias-test-gpt-pairs)
- [Dataset with Generated Test Sentences](https://huggingface.co/datasets/AnimaLab/bias-test-gpt-sentences)
- [Project Landing Page](https://biastest-animalab.github.io/)
## Data Structure
### Data Instances
Dataset instances consist of JSON files with bias specifications.
There are 2 types of specifications kept in separate folders:
- **'predefined biases'** - predefined bias specifications are obtained from prior work such as [Caliskan'17](https://www.science.org/doi/full/10.1126/science.aal4230?casa_token=c7LtvGWRQqIAAAAA:5QJ_M6jLk3675-50ZIIGgO2XQ5ewtZZxnKW-NlqUmxT6H112t4PIptRA94JLswFcXYIyl3yMML86duQ) and [Kurita'19](https://arxiv.org/pdf/1906.07337.pdf).
- **'custom biases'** - are biases defined by the users of our [HuggingFace Tool](https://huggingface.co/spaces/AnimaLab/bias-test-gpt-pairs). These are dynamically added to the repository based on user interactions with the tool.
An example of predefined bias specification in JSON format is provided below:
```json
{ "name": "Career_Family_vs_MaleName_FemaleName",
"social_groups": {
"Male_Names": ["John", "Paul", "Mike", "Kevin", "Steve", "Greg", "Jeff", "Bill"],
"Female_Names": ["Amy", "Joan", "Lisa", "Sarah", "Diana", "Kate", "Ann", "Donna"]
},
"attributes": {
"Career": ["executive", "management", "professional", "corporation", "salary", "office", "business", "career"],
"Family": ["home", "parents", "children", "family", "cousins", "marriage", "wedding", "relatives"]
},
"templates": ["[T] likes [A]", "[T] like [A]", "[T] is interested in [A]"],
"source": "Caliskan'17 - Weat 6",
"url": "https://purehost.bath.ac.uk/ws/portalfiles/portal/168480066/CaliskanEtAl_authors_full.pdf",
"type": "predefined",
"created": "09/06/2023 09:58:37"
}
```
### Data Fields
Each bias specification contains several data fields defining the bias to test itself, but also additional metadata about the bias specification origins.
#### Description of Data Fields in the Bias Specification
- **'name'**: a `string` feature - a short descriptive name of the bias.
- **'social_groups'**: a `dictionary` feature containing specifications of 2 or more social groups to compare.
- **'Male_Names'**, **'Female_Names'** - bias specific *keys* with names defininig the compared social groups. Value for each kay is a list of terms defining the particular social group.
- **'attributes'**: a `dictionary' feature containing specifications of 2 ideally polar opposite attributes to test in comparison of social groups.
- **'Career'**, **`Family'** - bias specific *keys* with names of opposing attributes. Value for each key is a list of terms defining the attribute.
- **'templates'**: a 'list' feature - legacy test sentence templates used in prior work. Used for a baseline bias measurement.
- **'source'**: a 'string' feature - the source of the bias specification, usually prior work
- **'url'**: a `string' feature - link to the research paper providing the bias specification
- **'type'**: a `string' feature - specifies whether bias has been predefined by prior work or defined using our [HuggingFace Tool](https://huggingface.co/spaces/AnimaLab/bias-test-gpt-pairs).
- **'created'**: a data of addition of the bias specification to the repository. Generated automatically upon addition from our tool.
### Bias Specification - Data Splits
The repository contains 15 predefined bias specifications based on prior work and an additional 4 or more custom-defined bias specifications.
We note that the number of custom-defined bias specifications is constantly growing as it is being populated by the interactions with the [HuggingFace Tool](https://huggingface.co/spaces/AnimaLab/bias-test-gpt-pairs).
| Type | Meaning | Size |
|--------|--------|------:|
| predefined | biases for which specification has been provided in prior work | 15 |
| custom | biases added to the repository based on interaction with the [BiasTestGPT tool](https://huggingface.co/spaces/AnimaLab/bias-test-gpt-pairs) | 4+ | |
ConsciousEnergies/LENR_Text_Corpus | 2023-08-29T19:57:35.000Z | [
"license:gpl-3.0",
"region:us"
] | ConsciousEnergies | null | null | null | 0 | 0 | ---
license: gpl-3.0
---
|
isashap/airesumesplits | 2023-08-29T19:59:40.000Z | [
"region:us"
] | isashap | null | null | null | 0 | 0 | |
Benson/testfinetuneguanaco | 2023-08-29T20:11:04.000Z | [
"region:us"
] | Benson | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-dolphin_5w | 2023-08-29T20:19:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-dolphin_5w
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-dolphin_5w](https://huggingface.co/CHIH-HUNG/llama-2-13b-dolphin_5w)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-dolphin_5w\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T20:17:39.109039](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-dolphin_5w/blob/main/results_2023-08-29T20%3A17%3A39.109039.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5633146375797443,\n\
\ \"acc_stderr\": 0.03425471883492036,\n \"acc_norm\": 0.5675228553540953,\n\
\ \"acc_norm_stderr\": 0.034233038089217335,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.44414329125975355,\n\
\ \"mc2_stderr\": 0.01487426061645184\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.01448470304885736,\n\
\ \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693026\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6195976897032464,\n\
\ \"acc_stderr\": 0.004844935327599204,\n \"acc_norm\": 0.8269269069906393,\n\
\ \"acc_norm_stderr\": 0.00377537291428549\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483184,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483184\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.026985289576552742,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.026985289576552742\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.0356796977226805,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.0356796977226805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412195,\n\
\ \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412195\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881564,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881564\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7431192660550459,\n \"acc_stderr\": 0.01873249292834246,\n \"\
acc_norm\": 0.7431192660550459,\n \"acc_norm_stderr\": 0.01873249292834246\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n\
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.041032038305145124,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.041032038305145124\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.02704685763071669,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.02704685763071669\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\
\ \"acc_stderr\": 0.015411308769686936,\n \"acc_norm\": 0.7535121328224776,\n\
\ \"acc_norm_stderr\": 0.015411308769686936\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.02546977014940017,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.02546977014940017\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n\
\ \"acc_stderr\": 0.016051419760310263,\n \"acc_norm\": 0.35977653631284917,\n\
\ \"acc_norm_stderr\": 0.016051419760310263\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363947,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363947\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776165,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.026624152478845853,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.026624152478845853\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n\
\ \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n\
\ \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.545751633986928,\n \"acc_stderr\": 0.020142974553795205,\n \
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.020142974553795205\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.44414329125975355,\n\
\ \"mc2_stderr\": 0.01487426061645184\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-dolphin_5w
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:17:39.109039.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:17:39.109039.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:17:39.109039.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:17:39.109039.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_17_39.109039
path:
- results_2023-08-29T20:17:39.109039.parquet
- split: latest
path:
- results_2023-08-29T20:17:39.109039.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-dolphin_5w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-dolphin_5w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-dolphin_5w](https://huggingface.co/CHIH-HUNG/llama-2-13b-dolphin_5w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-dolphin_5w",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T20:17:39.109039](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-dolphin_5w/blob/main/results_2023-08-29T20%3A17%3A39.109039.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5633146375797443,
"acc_stderr": 0.03425471883492036,
"acc_norm": 0.5675228553540953,
"acc_norm_stderr": 0.034233038089217335,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.44414329125975355,
"mc2_stderr": 0.01487426061645184
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693026
},
"harness|hellaswag|10": {
"acc": 0.6195976897032464,
"acc_stderr": 0.004844935327599204,
"acc_norm": 0.8269269069906393,
"acc_norm_stderr": 0.00377537291428549
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483184,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483184
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819067,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552742,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552742
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.0356796977226805,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.0356796977226805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5025641025641026,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.5025641025641026,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881564,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881564
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7431192660550459,
"acc_stderr": 0.01873249292834246,
"acc_norm": 0.7431192660550459,
"acc_norm_stderr": 0.01873249292834246
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.041032038305145124,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.041032038305145124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.02704685763071669,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.02704685763071669
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.015411308769686936,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.015411308769686936
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.02546977014940017,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.02546977014940017
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310263,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.027732834353363947,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.027732834353363947
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776165,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.026624152478845853,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.026624152478845853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.020142974553795205,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.020142974553795205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.44414329125975355,
"mc2_stderr": 0.01487426061645184
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13B-QLoRA-pipeline | 2023-09-16T19:25:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of StudentLLM/Alpagasus-2-13B-QLoRA-pipeline
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [StudentLLM/Alpagasus-2-13B-QLoRA-pipeline](https://huggingface.co/StudentLLM/Alpagasus-2-13B-QLoRA-pipeline)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13B-QLoRA-pipeline\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T19:25:05.915541](https://huggingface.co/datasets/open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13B-QLoRA-pipeline/blob/main/results_2023-09-16T19-25-05.915541.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n\
\ \"em_stderr\": 0.0004320097346038763,\n \"f1\": 0.06066065436241617,\n\
\ \"f1_stderr\": 0.0013657607384917807,\n \"acc\": 0.42588409458506093,\n\
\ \"acc_stderr\": 0.009997978043392182\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346038763,\n\
\ \"f1\": 0.06066065436241617,\n \"f1_stderr\": 0.0013657607384917807\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09249431387414708,\n \
\ \"acc_stderr\": 0.007980396874560178\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224185\n\
\ }\n}\n```"
repo_url: https://huggingface.co/StudentLLM/Alpagasus-2-13B-QLoRA-pipeline
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T19_25_05.915541
path:
- '**/details_harness|drop|3_2023-09-16T19-25-05.915541.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T19-25-05.915541.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T19_25_05.915541
path:
- '**/details_harness|gsm8k|5_2023-09-16T19-25-05.915541.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T19-25-05.915541.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:19:48.806105.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:19:48.806105.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:19:48.806105.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T19_25_05.915541
path:
- '**/details_harness|winogrande|5_2023-09-16T19-25-05.915541.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T19-25-05.915541.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_19_48.806105
path:
- results_2023-08-29T20:19:48.806105.parquet
- split: 2023_09_16T19_25_05.915541
path:
- results_2023-09-16T19-25-05.915541.parquet
- split: latest
path:
- results_2023-09-16T19-25-05.915541.parquet
---
# Dataset Card for Evaluation run of StudentLLM/Alpagasus-2-13B-QLoRA-pipeline
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/StudentLLM/Alpagasus-2-13B-QLoRA-pipeline
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [StudentLLM/Alpagasus-2-13B-QLoRA-pipeline](https://huggingface.co/StudentLLM/Alpagasus-2-13B-QLoRA-pipeline) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13B-QLoRA-pipeline",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T19:25:05.915541](https://huggingface.co/datasets/open-llm-leaderboard/details_StudentLLM__Alpagasus-2-13B-QLoRA-pipeline/blob/main/results_2023-09-16T19-25-05.915541.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0017827181208053692,
"em_stderr": 0.0004320097346038763,
"f1": 0.06066065436241617,
"f1_stderr": 0.0013657607384917807,
"acc": 0.42588409458506093,
"acc_stderr": 0.009997978043392182
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.0004320097346038763,
"f1": 0.06066065436241617,
"f1_stderr": 0.0013657607384917807
},
"harness|gsm8k|5": {
"acc": 0.09249431387414708,
"acc_stderr": 0.007980396874560178
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224185
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Seenka/directv-zocalos-agosto-5fps_vectors | 2023-08-29T20:31:54.000Z | [
"region:us"
] | Seenka | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_filename
dtype: string
- name: frame_time
dtype: time64[us]
- name: video_storage_path
dtype: string
- name: zocalo_id
dtype: string
- name: frame_number
dtype: int64
- name: is_L_shape
dtype: bool
- name: horizontal_check
dtype: bool
- name: vertical_check
dtype: bool
- name: black_image
dtype: bool
- name: horizontal_xmin
dtype: int64
- name: horizontal_xmax
dtype: int64
- name: horizontal_ymin
dtype: int64
- name: horizontal_ymax
dtype: int64
- name: vertical_xmin
dtype: int64
- name: vertical_xmax
dtype: int64
- name: vertical_ymin
dtype: int64
- name: vertical_ymax
dtype: int64
- name: cropped_image_horizontal
dtype: image
- name: cropped_image_vertical
dtype: image
- name: width
dtype: int64
- name: height
dtype: int64
- name: embedding_horizontal
sequence: float32
- name: embedding_vertical
sequence: float32
splits:
- name: train
num_bytes: 135001798.0
num_examples: 150
download_size: 133999646
dataset_size: 135001798.0
---
# Dataset Card for "directv-zocalos-agosto-5fps_vectors"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-L2-13B | 2023-09-22T17:08:19.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Mythical-Destroyer-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Mythical-Destroyer-L2-13B](https://huggingface.co/Sao10K/Mythical-Destroyer-L2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-L2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T17:08:07.137217](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-L2-13B/blob/main/results_2023-09-22T17-08-07.137217.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.016254194630872482,\n\
\ \"em_stderr\": 0.0012949822806761588,\n \"f1\": 0.12557990771812008,\n\
\ \"f1_stderr\": 0.0022451692243357826,\n \"acc\": 0.4180536664965267,\n\
\ \"acc_stderr\": 0.010042668742086671\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.016254194630872482,\n \"em_stderr\": 0.0012949822806761588,\n\
\ \"f1\": 0.12557990771812008,\n \"f1_stderr\": 0.0022451692243357826\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08946171341925702,\n \
\ \"acc_stderr\": 0.007861583049939723\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.012223754434233618\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Mythical-Destroyer-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T17_08_07.137217
path:
- '**/details_harness|drop|3_2023-09-22T17-08-07.137217.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T17-08-07.137217.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T17_08_07.137217
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-08-07.137217.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-08-07.137217.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:22:32.646135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:22:32.646135.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:22:32.646135.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T17_08_07.137217
path:
- '**/details_harness|winogrande|5_2023-09-22T17-08-07.137217.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T17-08-07.137217.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_22_32.646135
path:
- results_2023-08-29T20:22:32.646135.parquet
- split: 2023_09_22T17_08_07.137217
path:
- results_2023-09-22T17-08-07.137217.parquet
- split: latest
path:
- results_2023-09-22T17-08-07.137217.parquet
---
# Dataset Card for Evaluation run of Sao10K/Mythical-Destroyer-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Mythical-Destroyer-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Mythical-Destroyer-L2-13B](https://huggingface.co/Sao10K/Mythical-Destroyer-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-L2-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T17:08:07.137217](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-L2-13B/blob/main/results_2023-09-22T17-08-07.137217.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.016254194630872482,
"em_stderr": 0.0012949822806761588,
"f1": 0.12557990771812008,
"f1_stderr": 0.0022451692243357826,
"acc": 0.4180536664965267,
"acc_stderr": 0.010042668742086671
},
"harness|drop|3": {
"em": 0.016254194630872482,
"em_stderr": 0.0012949822806761588,
"f1": 0.12557990771812008,
"f1_stderr": 0.0022451692243357826
},
"harness|gsm8k|5": {
"acc": 0.08946171341925702,
"acc_stderr": 0.007861583049939723
},
"harness|winogrande|5": {
"acc": 0.7466456195737964,
"acc_stderr": 0.012223754434233618
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Masterjp123/Furry-Model | 2023-09-08T22:13:16.000Z | [
"license:unknown",
"region:us"
] | Masterjp123 | null | null | null | 0 | 0 | ---
license: unknown
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-dolphin_20w | 2023-08-29T20:31:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-dolphin_20w
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-dolphin_20w](https://huggingface.co/CHIH-HUNG/llama-2-13b-dolphin_20w)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-dolphin_20w\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T20:29:52.099975](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-dolphin_20w/blob/main/results_2023-08-29T20%3A29%3A52.099975.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5598686666287511,\n\
\ \"acc_stderr\": 0.03431185212908762,\n \"acc_norm\": 0.5640870113614027,\n\
\ \"acc_norm_stderr\": 0.03429076080604556,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815504,\n \"mc2\": 0.42666404895060756,\n\
\ \"mc2_stderr\": 0.01463198138552924\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5546075085324232,\n \"acc_stderr\": 0.014523987638344088,\n\
\ \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.014342036483436177\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6176060545708026,\n\
\ \"acc_stderr\": 0.004849788423944359,\n \"acc_norm\": 0.8255327623979287,\n\
\ \"acc_norm_stderr\": 0.003787351519370806\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.041321250197233685,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.041321250197233685\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\"\
: 0.3333333333333333,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.026662010578567107,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.026662010578567107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.036277305750224094,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.036277305750224094\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786752,\n \"\
acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786752\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5504201680672269,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.5504201680672269,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7394495412844037,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.7394495412844037,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502327,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502327\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460302,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.02704685763071668,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.02704685763071668\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n\
\ \"acc_stderr\": 0.01507552323810107,\n \"acc_norm\": 0.768837803320562,\n\
\ \"acc_norm_stderr\": 0.01507552323810107\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40893854748603353,\n\
\ \"acc_stderr\": 0.01644283065471554,\n \"acc_norm\": 0.40893854748603353,\n\
\ \"acc_norm_stderr\": 0.01644283065471554\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.028036092273891765,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.028036092273891765\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.02731684767419271,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.02731684767419271\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4132985658409387,\n\
\ \"acc_stderr\": 0.012576779494860088,\n \"acc_norm\": 0.4132985658409387,\n\
\ \"acc_norm_stderr\": 0.012576779494860088\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.03035230339535196,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.03035230339535196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5571895424836601,\n \"acc_stderr\": 0.020095083154577344,\n \
\ \"acc_norm\": 0.5571895424836601,\n \"acc_norm_stderr\": 0.020095083154577344\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815504,\n \"mc2\": 0.42666404895060756,\n\
\ \"mc2_stderr\": 0.01463198138552924\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-dolphin_20w
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:29:52.099975.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:29:52.099975.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:29:52.099975.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:29:52.099975.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_29_52.099975
path:
- results_2023-08-29T20:29:52.099975.parquet
- split: latest
path:
- results_2023-08-29T20:29:52.099975.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-dolphin_20w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-dolphin_20w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-dolphin_20w](https://huggingface.co/CHIH-HUNG/llama-2-13b-dolphin_20w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-dolphin_20w",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T20:29:52.099975](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-dolphin_20w/blob/main/results_2023-08-29T20%3A29%3A52.099975.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5598686666287511,
"acc_stderr": 0.03431185212908762,
"acc_norm": 0.5640870113614027,
"acc_norm_stderr": 0.03429076080604556,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815504,
"mc2": 0.42666404895060756,
"mc2_stderr": 0.01463198138552924
},
"harness|arc:challenge|25": {
"acc": 0.5546075085324232,
"acc_stderr": 0.014523987638344088,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.014342036483436177
},
"harness|hellaswag|10": {
"acc": 0.6176060545708026,
"acc_stderr": 0.004849788423944359,
"acc_norm": 0.8255327623979287,
"acc_norm_stderr": 0.003787351519370806
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.029582245128384303,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.029582245128384303
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.041321250197233685,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.041321250197233685
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0242785680243077,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0242785680243077
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.026662010578567107,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.026662010578567107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.036277305750224094,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.036277305750224094
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.03208779558786752,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.03208779558786752
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624526,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624526
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5504201680672269,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.5504201680672269,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502327,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502327
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.02704685763071668,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.02704685763071668
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.01507552323810107,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.01507552323810107
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40893854748603353,
"acc_stderr": 0.01644283065471554,
"acc_norm": 0.40893854748603353,
"acc_norm_stderr": 0.01644283065471554
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.028036092273891765,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.028036092273891765
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.02731684767419271,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.02731684767419271
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4132985658409387,
"acc_stderr": 0.012576779494860088,
"acc_norm": 0.4132985658409387,
"acc_norm_stderr": 0.012576779494860088
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.03035230339535196,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.03035230339535196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5571895424836601,
"acc_stderr": 0.020095083154577344,
"acc_norm": 0.5571895424836601,
"acc_norm_stderr": 0.020095083154577344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815504,
"mc2": 0.42666404895060756,
"mc2_stderr": 0.01463198138552924
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mrmocciai/pre-trained-duplicate | 2023-09-02T17:54:55.000Z | [
"task_categories:feature-extraction",
"language:en",
"license:mit",
"region:us"
] | mrmocciai | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- feature-extraction
language:
- en
---
# This file are pre-trained rvc for training models.<br />
## This files are not mine, i just backing up this file due to colab new term of service that making rvc can't run in google colab.<br />
### Credit owner<br />
[Liu](https://huggingface.co/lj1995)<br /> |
open-llm-leaderboard/details_yeontaek__airoboros-2.1-llama-2-13B-QLoRa | 2023-08-29T20:39:52.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/airoboros-2.1-llama-2-13B-QLoRa
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/airoboros-2.1-llama-2-13B-QLoRa](https://huggingface.co/yeontaek/airoboros-2.1-llama-2-13B-QLoRa)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__airoboros-2.1-llama-2-13B-QLoRa\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T20:38:29.069623](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__airoboros-2.1-llama-2-13B-QLoRa/blob/main/results_2023-08-29T20%3A38%3A29.069623.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5494715237795186,\n\
\ \"acc_stderr\": 0.03458421881614154,\n \"acc_norm\": 0.5533130598319451,\n\
\ \"acc_norm_stderr\": 0.03456382482357915,\n \"mc1\": 0.31211750305997554,\n\
\ \"mc1_stderr\": 0.01622075676952093,\n \"mc2\": 0.4514307596819309,\n\
\ \"mc2_stderr\": 0.015825884159192616\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.014484703048857359,\n\
\ \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.014332236306790147\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6340370444134634,\n\
\ \"acc_stderr\": 0.004807146925162056,\n \"acc_norm\": 0.8291177056363275,\n\
\ \"acc_norm_stderr\": 0.003756368106048426\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.03036505082911521,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.03036505082911521\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651283,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651283\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\
\ \"acc_stderr\": 0.038118909889404126,\n \"acc_norm\": 0.4913294797687861,\n\
\ \"acc_norm_stderr\": 0.038118909889404126\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\
\ \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n\
\ \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.034953345821629345,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.034953345821629345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178274,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178274\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.02533466708095492,\n \
\ \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.02533466708095492\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7192660550458716,\n \"acc_stderr\": 0.019266055045871616,\n \"\
acc_norm\": 0.7192660550458716,\n \"acc_norm_stderr\": 0.019266055045871616\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.33796296296296297,\n \"acc_stderr\": 0.032259413526312945,\n \"\
acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.032259413526312945\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7130801687763713,\n \"acc_stderr\": 0.029443773022594693,\n \
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.029443773022594693\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.02704685763071668,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.02704685763071668\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643634,\n\
\ \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643634\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n\
\ \"acc_stderr\": 0.015251931579208183,\n \"acc_norm\": 0.29497206703910617,\n\
\ \"acc_norm_stderr\": 0.015251931579208183\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387306,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387306\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484624,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484624\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.027002521034516468,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.027002521034516468\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.02927553215970473,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.02927553215970473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40352020860495436,\n\
\ \"acc_stderr\": 0.012530241301193186,\n \"acc_norm\": 0.40352020860495436,\n\
\ \"acc_norm_stderr\": 0.012530241301193186\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555026,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555026\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5392156862745098,\n \"acc_stderr\": 0.020165523313907904,\n \
\ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.020165523313907904\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935556,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935556\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31211750305997554,\n\
\ \"mc1_stderr\": 0.01622075676952093,\n \"mc2\": 0.4514307596819309,\n\
\ \"mc2_stderr\": 0.015825884159192616\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/airoboros-2.1-llama-2-13B-QLoRa
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:38:29.069623.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:38:29.069623.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:38:29.069623.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:38:29.069623.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_38_29.069623
path:
- results_2023-08-29T20:38:29.069623.parquet
- split: latest
path:
- results_2023-08-29T20:38:29.069623.parquet
---
# Dataset Card for Evaluation run of yeontaek/airoboros-2.1-llama-2-13B-QLoRa
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/airoboros-2.1-llama-2-13B-QLoRa
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/airoboros-2.1-llama-2-13B-QLoRa](https://huggingface.co/yeontaek/airoboros-2.1-llama-2-13B-QLoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__airoboros-2.1-llama-2-13B-QLoRa",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T20:38:29.069623](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__airoboros-2.1-llama-2-13B-QLoRa/blob/main/results_2023-08-29T20%3A38%3A29.069623.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5494715237795186,
"acc_stderr": 0.03458421881614154,
"acc_norm": 0.5533130598319451,
"acc_norm_stderr": 0.03456382482357915,
"mc1": 0.31211750305997554,
"mc1_stderr": 0.01622075676952093,
"mc2": 0.4514307596819309,
"mc2_stderr": 0.015825884159192616
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.014484703048857359,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.014332236306790147
},
"harness|hellaswag|10": {
"acc": 0.6340370444134634,
"acc_stderr": 0.004807146925162056,
"acc_norm": 0.8291177056363275,
"acc_norm_stderr": 0.003756368106048426
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.03036505082911521,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.03036505082911521
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651283,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651283
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.038118909889404126,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.038118909889404126
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.034953345821629345,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.034953345821629345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.036462049632538115,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.036462049632538115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178274,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178274
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.02533466708095492,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.02533466708095492
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712173,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712173
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7192660550458716,
"acc_stderr": 0.019266055045871616,
"acc_norm": 0.7192660550458716,
"acc_norm_stderr": 0.019266055045871616
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.032259413526312945,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.032259413526312945
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.029443773022594693,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.029443773022594693
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.02704685763071668,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.02704685763071668
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643634,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643634
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.015251931579208183,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.015251931579208183
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.028245134024387306,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.028245134024387306
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484624,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484624
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.027002521034516468,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.027002521034516468
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.02927553215970473,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.02927553215970473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40352020860495436,
"acc_stderr": 0.012530241301193186,
"acc_norm": 0.40352020860495436,
"acc_norm_stderr": 0.012530241301193186
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555026,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.020165523313907904,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.020165523313907904
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935556,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935556
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31211750305997554,
"mc1_stderr": 0.01622075676952093,
"mc2": 0.4514307596819309,
"mc2_stderr": 0.015825884159192616
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_xzuyn__LLaMa-2-LIMA-7B-QLoRA_v2 | 2023-08-29T20:44:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xzuyn/LLaMa-2-LIMA-7B-QLoRA_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xzuyn/LLaMa-2-LIMA-7B-QLoRA_v2](https://huggingface.co/xzuyn/LLaMa-2-LIMA-7B-QLoRA_v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__LLaMa-2-LIMA-7B-QLoRA_v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T20:43:19.802214](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-LIMA-7B-QLoRA_v2/blob/main/results_2023-08-29T20%3A43%3A19.802214.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.42772147627937,\n \"acc_stderr\"\
: 0.0352621795234267,\n \"acc_norm\": 0.43164590880132125,\n \"acc_norm_stderr\"\
: 0.03524758282529255,\n \"mc1\": 0.28151774785801714,\n \"mc1_stderr\"\
: 0.01574402724825605,\n \"mc2\": 0.42597374324834275,\n \"mc2_stderr\"\
: 0.015781635062808143\n },\n \"harness|arc:challenge|25\": {\n \"\
acc\": 0.4880546075085324,\n \"acc_stderr\": 0.014607220340597171,\n \
\ \"acc_norm\": 0.5273037542662116,\n \"acc_norm_stderr\": 0.014589589101985994\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6005775741884087,\n\
\ \"acc_stderr\": 0.004887787255353494,\n \"acc_norm\": 0.7928699462258514,\n\
\ \"acc_norm_stderr\": 0.004044213304049376\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.04026097083296558,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.04026097083296558\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.41509433962264153,\n \"acc_stderr\": 0.030325945789286105,\n\
\ \"acc_norm\": 0.41509433962264153,\n \"acc_norm_stderr\": 0.030325945789286105\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835362,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835362\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.02293097307163335,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.02293097307163335\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.43548387096774194,\n \"acc_stderr\": 0.028206225591502737,\n \"\
acc_norm\": 0.43548387096774194,\n \"acc_norm_stderr\": 0.028206225591502737\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n \"\
acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.038881769216741004,\n\
\ \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.038881769216741004\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.03561625488673745,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.03561625488673745\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6424870466321243,\n \"acc_stderr\": 0.03458816042181012,\n\
\ \"acc_norm\": 0.6424870466321243,\n \"acc_norm_stderr\": 0.03458816042181012\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4128205128205128,\n \"acc_stderr\": 0.02496268356433182,\n \
\ \"acc_norm\": 0.4128205128205128,\n \"acc_norm_stderr\": 0.02496268356433182\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5559633027522936,\n \"acc_stderr\": 0.021302621211654518,\n \"\
acc_norm\": 0.5559633027522936,\n \"acc_norm_stderr\": 0.021302621211654518\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.21296296296296297,\n \"acc_stderr\": 0.027920963147993656,\n \"\
acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.027920963147993656\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4852941176470588,\n \"acc_stderr\": 0.03507793834791324,\n \"\
acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03507793834791324\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5147679324894515,\n \"acc_stderr\": 0.032533028078777386,\n \
\ \"acc_norm\": 0.5147679324894515,\n \"acc_norm_stderr\": 0.032533028078777386\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n\
\ \"acc_stderr\": 0.033460150119732274,\n \"acc_norm\": 0.4618834080717489,\n\
\ \"acc_norm_stderr\": 0.033460150119732274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.46564885496183206,\n \"acc_stderr\": 0.04374928560599738,\n\
\ \"acc_norm\": 0.46564885496183206,\n \"acc_norm_stderr\": 0.04374928560599738\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\
\ \"acc_stderr\": 0.048129173245368216,\n \"acc_norm\": 0.4537037037037037,\n\
\ \"acc_norm_stderr\": 0.048129173245368216\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5339805825242718,\n \"acc_stderr\": 0.0493929144727348,\n\
\ \"acc_norm\": 0.5339805825242718,\n \"acc_norm_stderr\": 0.0493929144727348\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.594017094017094,\n\
\ \"acc_stderr\": 0.03217180182641086,\n \"acc_norm\": 0.594017094017094,\n\
\ \"acc_norm_stderr\": 0.03217180182641086\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5696040868454662,\n\
\ \"acc_stderr\": 0.017705868776292395,\n \"acc_norm\": 0.5696040868454662,\n\
\ \"acc_norm_stderr\": 0.017705868776292395\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.45375722543352603,\n \"acc_stderr\": 0.026803720583206184,\n\
\ \"acc_norm\": 0.45375722543352603,\n \"acc_norm_stderr\": 0.026803720583206184\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n\
\ \"acc_stderr\": 0.014854993938010078,\n \"acc_norm\": 0.27039106145251396,\n\
\ \"acc_norm_stderr\": 0.014854993938010078\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.028431095444176643,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.028431095444176643\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n\
\ \"acc_stderr\": 0.02827435985489425,\n \"acc_norm\": 0.5466237942122186,\n\
\ \"acc_norm_stderr\": 0.02827435985489425\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n\
\ \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759422,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759422\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30638852672750977,\n\
\ \"acc_stderr\": 0.011773980329380731,\n \"acc_norm\": 0.30638852672750977,\n\
\ \"acc_norm_stderr\": 0.011773980329380731\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.43014705882352944,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.43014705882352944,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.42320261437908496,\n \"acc_stderr\": 0.019987809769482064,\n \
\ \"acc_norm\": 0.42320261437908496,\n \"acc_norm_stderr\": 0.019987809769482064\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n\
\ \"acc_stderr\": 0.047764491623961985,\n \"acc_norm\": 0.4636363636363636,\n\
\ \"acc_norm_stderr\": 0.047764491623961985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37142857142857144,\n \"acc_stderr\": 0.03093285879278985,\n\
\ \"acc_norm\": 0.37142857142857144,\n \"acc_norm_stderr\": 0.03093285879278985\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.527363184079602,\n\
\ \"acc_stderr\": 0.03530235517334682,\n \"acc_norm\": 0.527363184079602,\n\
\ \"acc_norm_stderr\": 0.03530235517334682\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.03664314777288085,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.03664314777288085\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488905,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488905\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.42597374324834275,\n\
\ \"mc2_stderr\": 0.015781635062808143\n }\n}\n```"
repo_url: https://huggingface.co/xzuyn/LLaMa-2-LIMA-7B-QLoRA_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:43:19.802214.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:43:19.802214.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:43:19.802214.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:43:19.802214.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_43_19.802214
path:
- results_2023-08-29T20:43:19.802214.parquet
- split: latest
path:
- results_2023-08-29T20:43:19.802214.parquet
---
# Dataset Card for Evaluation run of xzuyn/LLaMa-2-LIMA-7B-QLoRA_v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xzuyn/LLaMa-2-LIMA-7B-QLoRA_v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xzuyn/LLaMa-2-LIMA-7B-QLoRA_v2](https://huggingface.co/xzuyn/LLaMa-2-LIMA-7B-QLoRA_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__LLaMa-2-LIMA-7B-QLoRA_v2",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T20:43:19.802214](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-LIMA-7B-QLoRA_v2/blob/main/results_2023-08-29T20%3A43%3A19.802214.json):
```python
{
"all": {
"acc": 0.42772147627937,
"acc_stderr": 0.0352621795234267,
"acc_norm": 0.43164590880132125,
"acc_norm_stderr": 0.03524758282529255,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.42597374324834275,
"mc2_stderr": 0.015781635062808143
},
"harness|arc:challenge|25": {
"acc": 0.4880546075085324,
"acc_stderr": 0.014607220340597171,
"acc_norm": 0.5273037542662116,
"acc_norm_stderr": 0.014589589101985994
},
"harness|hellaswag|10": {
"acc": 0.6005775741884087,
"acc_stderr": 0.004887787255353494,
"acc_norm": 0.7928699462258514,
"acc_norm_stderr": 0.004044213304049376
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.04026097083296558,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.04026097083296558
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.41509433962264153,
"acc_stderr": 0.030325945789286105,
"acc_norm": 0.41509433962264153,
"acc_norm_stderr": 0.030325945789286105
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835362,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835362
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.02293097307163335,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.02293097307163335
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.43548387096774194,
"acc_stderr": 0.028206225591502737,
"acc_norm": 0.43548387096774194,
"acc_norm_stderr": 0.028206225591502737
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.038881769216741004,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.038881769216741004
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.03561625488673745,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.03561625488673745
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6424870466321243,
"acc_stderr": 0.03458816042181012,
"acc_norm": 0.6424870466321243,
"acc_norm_stderr": 0.03458816042181012
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4128205128205128,
"acc_stderr": 0.02496268356433182,
"acc_norm": 0.4128205128205128,
"acc_norm_stderr": 0.02496268356433182
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3907563025210084,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.3907563025210084,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5559633027522936,
"acc_stderr": 0.021302621211654518,
"acc_norm": 0.5559633027522936,
"acc_norm_stderr": 0.021302621211654518
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.027920963147993656,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.027920963147993656
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03507793834791324,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03507793834791324
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5147679324894515,
"acc_stderr": 0.032533028078777386,
"acc_norm": 0.5147679324894515,
"acc_norm_stderr": 0.032533028078777386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4618834080717489,
"acc_stderr": 0.033460150119732274,
"acc_norm": 0.4618834080717489,
"acc_norm_stderr": 0.033460150119732274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.46564885496183206,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.46564885496183206,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.048129173245368216,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.048129173245368216
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.39263803680981596,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.39263803680981596,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.045218299028335865,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.045218299028335865
},
"harness|hendrycksTest-management|5": {
"acc": 0.5339805825242718,
"acc_stderr": 0.0493929144727348,
"acc_norm": 0.5339805825242718,
"acc_norm_stderr": 0.0493929144727348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.594017094017094,
"acc_stderr": 0.03217180182641086,
"acc_norm": 0.594017094017094,
"acc_norm_stderr": 0.03217180182641086
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5696040868454662,
"acc_stderr": 0.017705868776292395,
"acc_norm": 0.5696040868454662,
"acc_norm_stderr": 0.017705868776292395
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.45375722543352603,
"acc_stderr": 0.026803720583206184,
"acc_norm": 0.45375722543352603,
"acc_norm_stderr": 0.026803720583206184
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010078,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010078
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.028431095444176643,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.028431095444176643
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.02827435985489425,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.02827435985489425
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30638852672750977,
"acc_stderr": 0.011773980329380731,
"acc_norm": 0.30638852672750977,
"acc_norm_stderr": 0.011773980329380731
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.43014705882352944,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.43014705882352944,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42320261437908496,
"acc_stderr": 0.019987809769482064,
"acc_norm": 0.42320261437908496,
"acc_norm_stderr": 0.019987809769482064
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4636363636363636,
"acc_stderr": 0.047764491623961985,
"acc_norm": 0.4636363636363636,
"acc_norm_stderr": 0.047764491623961985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37142857142857144,
"acc_stderr": 0.03093285879278985,
"acc_norm": 0.37142857142857144,
"acc_norm_stderr": 0.03093285879278985
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.527363184079602,
"acc_stderr": 0.03530235517334682,
"acc_norm": 0.527363184079602,
"acc_norm_stderr": 0.03530235517334682
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.03664314777288085,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.03664314777288085
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488905,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488905
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.42597374324834275,
"mc2_stderr": 0.015781635062808143
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-V4 | 2023-08-29T20:46:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of The-Face-Of-Goonery/Huginn-13b-V4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [The-Face-Of-Goonery/Huginn-13b-V4](https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-V4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-V4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T20:45:16.313049](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-V4/blob/main/results_2023-08-29T20%3A45%3A16.313049.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5259069198696034,\n\
\ \"acc_stderr\": 0.034732780984597124,\n \"acc_norm\": 0.5297450802692631,\n\
\ \"acc_norm_stderr\": 0.03471265443522423,\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.506195531543039,\n\
\ \"mc2_stderr\": 0.01543396728769934\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520769,\n\
\ \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693026\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6285600477992431,\n\
\ \"acc_stderr\": 0.004822022254886021,\n \"acc_norm\": 0.8234415455088627,\n\
\ \"acc_norm_stderr\": 0.0038051533447130874\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.04177578950739993,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.04177578950739993\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006715,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006715\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523864,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070644,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070644\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845436,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.0279404571362284,\n \"acc_norm\":\
\ 0.3,\n \"acc_norm_stderr\": 0.0279404571362284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n\
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.710091743119266,\n\
\ \"acc_stderr\": 0.019453066609201597,\n \"acc_norm\": 0.710091743119266,\n\
\ \"acc_norm_stderr\": 0.019453066609201597\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n\
\ \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.696078431372549,\n \"acc_stderr\": 0.032282103870378914,\n \"\
acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.032282103870378914\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6877637130801688,\n \"acc_stderr\": 0.03016513786784701,\n \
\ \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.03016513786784701\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041019,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041019\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.03856672163548913,\n\
\ \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.03856672163548913\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404033,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404033\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7075351213282248,\n\
\ \"acc_stderr\": 0.016267000684598642,\n \"acc_norm\": 0.7075351213282248,\n\
\ \"acc_norm_stderr\": 0.016267000684598642\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.02642481659400985,\n\
\ \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.02642481659400985\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574877,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574877\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5751633986928104,\n \"acc_stderr\": 0.028304576673141103,\n\
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.028304576673141103\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n\
\ \"acc_stderr\": 0.027982680459759567,\n \"acc_norm\": 0.5852090032154341,\n\
\ \"acc_norm_stderr\": 0.027982680459759567\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.027648477877413327,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.027648477877413327\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596147,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3970013037809648,\n\
\ \"acc_stderr\": 0.012496346982909554,\n \"acc_norm\": 0.3970013037809648,\n\
\ \"acc_norm_stderr\": 0.012496346982909554\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47875816993464054,\n \"acc_stderr\": 0.020209572388600265,\n \
\ \"acc_norm\": 0.47875816993464054,\n \"acc_norm_stderr\": 0.020209572388600265\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.506195531543039,\n\
\ \"mc2_stderr\": 0.01543396728769934\n }\n}\n```"
repo_url: https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-V4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:45:16.313049.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:45:16.313049.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:45:16.313049.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:45:16.313049.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_45_16.313049
path:
- results_2023-08-29T20:45:16.313049.parquet
- split: latest
path:
- results_2023-08-29T20:45:16.313049.parquet
---
# Dataset Card for Evaluation run of The-Face-Of-Goonery/Huginn-13b-V4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-V4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [The-Face-Of-Goonery/Huginn-13b-V4](https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-V4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-V4",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T20:45:16.313049](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-V4/blob/main/results_2023-08-29T20%3A45%3A16.313049.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5259069198696034,
"acc_stderr": 0.034732780984597124,
"acc_norm": 0.5297450802692631,
"acc_norm_stderr": 0.03471265443522423,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.506195531543039,
"mc2_stderr": 0.01543396728769934
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520769,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693026
},
"harness|hellaswag|10": {
"acc": 0.6285600477992431,
"acc_stderr": 0.004822022254886021,
"acc_norm": 0.8234415455088627,
"acc_norm_stderr": 0.0038051533447130874
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.04177578950739993,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.04177578950739993
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006715,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006715
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300645,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845436,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.0279404571362284,
"acc_norm": 0.3,
"acc_norm_stderr": 0.0279404571362284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.710091743119266,
"acc_stderr": 0.019453066609201597,
"acc_norm": 0.710091743119266,
"acc_norm_stderr": 0.019453066609201597
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.032282103870378914,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.032282103870378914
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6877637130801688,
"acc_stderr": 0.03016513786784701,
"acc_norm": 0.6877637130801688,
"acc_norm_stderr": 0.03016513786784701
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041019,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041019
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5950920245398773,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.5950920245398773,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404033,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404033
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7075351213282248,
"acc_stderr": 0.016267000684598642,
"acc_norm": 0.7075351213282248,
"acc_norm_stderr": 0.016267000684598642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.02642481659400985,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.02642481659400985
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574877,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.028304576673141103,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.028304576673141103
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.027982680459759567,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.027982680459759567
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.027648477877413327,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.027648477877413327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596147,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3970013037809648,
"acc_stderr": 0.012496346982909554,
"acc_norm": 0.3970013037809648,
"acc_norm_stderr": 0.012496346982909554
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47875816993464054,
"acc_stderr": 0.020209572388600265,
"acc_norm": 0.47875816993464054,
"acc_norm_stderr": 0.020209572388600265
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.506195531543039,
"mc2_stderr": 0.01543396728769934
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_5w | 2023-08-29T20:47:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-OpenOrca_5w
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-OpenOrca_5w](https://huggingface.co/CHIH-HUNG/llama-2-13b-OpenOrca_5w)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_5w\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T20:46:12.549567](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_5w/blob/main/results_2023-08-29T20%3A46%3A12.549567.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5621049855400969,\n\
\ \"acc_stderr\": 0.03417588893701613,\n \"acc_norm\": 0.5662577729070264,\n\
\ \"acc_norm_stderr\": 0.03415418968434573,\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.01628920337440338,\n \"mc2\": 0.4487000954431151,\n\
\ \"mc2_stderr\": 0.01484191682851324\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5733788395904437,\n \"acc_stderr\": 0.014453185592920293,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892889\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.619896434973113,\n\
\ \"acc_stderr\": 0.004844199910173035,\n \"acc_norm\": 0.8282214698267277,\n\
\ \"acc_norm_stderr\": 0.0037641697466461736\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483184,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483184\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.630188679245283,\n \"acc_stderr\": 0.029711421880107933,\n\
\ \"acc_norm\": 0.630188679245283,\n \"acc_norm_stderr\": 0.029711421880107933\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.032025630761017346,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.032025630761017346\n \
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.02441923496681906,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.02441923496681906\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.027104826328100944,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.027104826328100944\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316455,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316455\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.49230769230769234,\n \"acc_stderr\": 0.025348006031534778,\n\
\ \"acc_norm\": 0.49230769230769234,\n \"acc_norm_stderr\": 0.025348006031534778\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7412844036697248,\n \"acc_stderr\": 0.018776052319619624,\n \"\
acc_norm\": 0.7412844036697248,\n \"acc_norm_stderr\": 0.018776052319619624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069422,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069422\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890477,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890477\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7573435504469987,\n\
\ \"acc_stderr\": 0.015329888940899858,\n \"acc_norm\": 0.7573435504469987,\n\
\ \"acc_norm_stderr\": 0.015329888940899858\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688218,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688218\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n\
\ \"acc_stderr\": 0.01611523550486548,\n \"acc_norm\": 0.3664804469273743,\n\
\ \"acc_norm_stderr\": 0.01611523550486548\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4165580182529335,\n\
\ \"acc_stderr\": 0.012591153245057388,\n \"acc_norm\": 0.4165580182529335,\n\
\ \"acc_norm_stderr\": 0.012591153245057388\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5555555555555556,\n \"acc_stderr\": 0.020102583895887184,\n \
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.020102583895887184\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547724,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547724\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357302,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.01628920337440338,\n \"mc2\": 0.4487000954431151,\n\
\ \"mc2_stderr\": 0.01484191682851324\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-OpenOrca_5w
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:46:12.549567.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:46:12.549567.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:46:12.549567.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:46:12.549567.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_46_12.549567
path:
- results_2023-08-29T20:46:12.549567.parquet
- split: latest
path:
- results_2023-08-29T20:46:12.549567.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-OpenOrca_5w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-OpenOrca_5w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-OpenOrca_5w](https://huggingface.co/CHIH-HUNG/llama-2-13b-OpenOrca_5w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_5w",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T20:46:12.549567](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-OpenOrca_5w/blob/main/results_2023-08-29T20%3A46%3A12.549567.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5621049855400969,
"acc_stderr": 0.03417588893701613,
"acc_norm": 0.5662577729070264,
"acc_norm_stderr": 0.03415418968434573,
"mc1": 0.31701346389228885,
"mc1_stderr": 0.01628920337440338,
"mc2": 0.4487000954431151,
"mc2_stderr": 0.01484191682851324
},
"harness|arc:challenge|25": {
"acc": 0.5733788395904437,
"acc_stderr": 0.014453185592920293,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892889
},
"harness|hellaswag|10": {
"acc": 0.619896434973113,
"acc_stderr": 0.004844199910173035,
"acc_norm": 0.8282214698267277,
"acc_norm_stderr": 0.0037641697466461736
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483184,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483184
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.630188679245283,
"acc_stderr": 0.029711421880107933,
"acc_norm": 0.630188679245283,
"acc_norm_stderr": 0.029711421880107933
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.032025630761017346,
"acc_norm": 0.4,
"acc_norm_stderr": 0.032025630761017346
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.02441923496681906,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.02441923496681906
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.027104826328100944,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.027104826328100944
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316455,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316455
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49230769230769234,
"acc_stderr": 0.025348006031534778,
"acc_norm": 0.49230769230769234,
"acc_norm_stderr": 0.025348006031534778
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7412844036697248,
"acc_stderr": 0.018776052319619624,
"acc_norm": 0.7412844036697248,
"acc_norm_stderr": 0.018776052319619624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069422,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069422
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890477,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890477
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7573435504469987,
"acc_stderr": 0.015329888940899858,
"acc_norm": 0.7573435504469987,
"acc_norm_stderr": 0.015329888940899858
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688218,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.01611523550486548,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.01611523550486548
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4165580182529335,
"acc_stderr": 0.012591153245057388,
"acc_norm": 0.4165580182529335,
"acc_norm_stderr": 0.012591153245057388
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.020102583895887184,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.020102583895887184
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547724,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547724
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357302,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31701346389228885,
"mc1_stderr": 0.01628920337440338,
"mc2": 0.4487000954431151,
"mc2_stderr": 0.01484191682851324
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-alpaca-test | 2023-08-29T20:51:09.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-alpaca-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-alpaca-test](https://huggingface.co/CHIH-HUNG/llama-2-13b-alpaca-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-alpaca-test\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T20:49:48.067362](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-alpaca-test/blob/main/results_2023-08-29T20%3A49%3A48.067362.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5568910569830451,\n\
\ \"acc_stderr\": 0.03436225133323378,\n \"acc_norm\": 0.5610380147243772,\n\
\ \"acc_norm_stderr\": 0.034342335699213765,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.3693612523342933,\n\
\ \"mc2_stderr\": 0.014364347604420232\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.014515573873348899,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946704\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6117307309300936,\n\
\ \"acc_stderr\": 0.004863603638367449,\n \"acc_norm\": 0.8128858793069109,\n\
\ \"acc_norm_stderr\": 0.003892060546588329\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483184,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483184\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425072,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.026985289576552742,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.026985289576552742\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161551,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161551\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836557,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836557\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454806,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454806\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106522,\n\
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106522\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7376146788990826,\n \"acc_stderr\": 0.018861885021534738,\n \"\
acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.018861885021534738\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241446,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241446\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398674,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398674\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977254,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249619,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249619\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063145,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063145\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.02700252103451647,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.02700252103451647\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4178617992177314,\n\
\ \"acc_stderr\": 0.012596744108998557,\n \"acc_norm\": 0.4178617992177314,\n\
\ \"acc_norm_stderr\": 0.012596744108998557\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.03036544647727568,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.03036544647727568\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.565359477124183,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.3693612523342933,\n\
\ \"mc2_stderr\": 0.014364347604420232\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-alpaca-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:49:48.067362.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- results_2023-08-29T20:49:48.067362.parquet
- split: latest
path:
- results_2023-08-29T20:49:48.067362.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-alpaca-test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-alpaca-test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-alpaca-test](https://huggingface.co/CHIH-HUNG/llama-2-13b-alpaca-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-alpaca-test",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T20:49:48.067362](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-alpaca-test/blob/main/results_2023-08-29T20%3A49%3A48.067362.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5568910569830451,
"acc_stderr": 0.03436225133323378,
"acc_norm": 0.5610380147243772,
"acc_norm_stderr": 0.034342335699213765,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.3693612523342933,
"mc2_stderr": 0.014364347604420232
},
"harness|arc:challenge|25": {
"acc": 0.5571672354948806,
"acc_stderr": 0.014515573873348899,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.014312094557946704
},
"harness|hellaswag|10": {
"acc": 0.6117307309300936,
"acc_stderr": 0.004863603638367449,
"acc_norm": 0.8128858793069109,
"acc_norm_stderr": 0.003892060546588329
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483184,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483184
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425072,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552742,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552742
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03681050869161551,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03681050869161551
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836557,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836557
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454806,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454806
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106522,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106522
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.018861885021534738,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.018861885021534738
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.036230899157241446,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.036230899157241446
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398674,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398674
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977254,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249619,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249619
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.02799672318063145,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.02799672318063145
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.02700252103451647,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.02700252103451647
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4178617992177314,
"acc_stderr": 0.012596744108998557,
"acc_norm": 0.4178617992177314,
"acc_norm_stderr": 0.012596744108998557
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.03036544647727568,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.03036544647727568
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.3693612523342933,
"mc2_stderr": 0.014364347604420232
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-V2-L2-13B | 2023-09-17T00:31:52.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Mythical-Destroyer-V2-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Mythical-Destroyer-V2-L2-13B](https://huggingface.co/Sao10K/Mythical-Destroyer-V2-L2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-V2-L2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T00:31:39.922453](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-V2-L2-13B/blob/main/results_2023-09-17T00-31-39.922453.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004089765100671141,\n\
\ \"em_stderr\": 0.0006535802669912855,\n \"f1\": 0.10332739093959775,\n\
\ \"f1_stderr\": 0.00186450066100098,\n \"acc\": 0.3737174427782163,\n\
\ \"acc_stderr\": 0.0061055742246970525\n },\n \"harness|drop|3\":\
\ {\n \"em\": 0.004089765100671141,\n \"em_stderr\": 0.0006535802669912855,\n\
\ \"f1\": 0.10332739093959775,\n \"f1_stderr\": 0.00186450066100098\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7474348855564326,\n\
\ \"acc_stderr\": 0.012211148449394105\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Mythical-Destroyer-V2-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T00_31_39.922453
path:
- '**/details_harness|drop|3_2023-09-17T00-31-39.922453.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T00-31-39.922453.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T00_31_39.922453
path:
- '**/details_harness|gsm8k|5_2023-09-17T00-31-39.922453.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T00-31-39.922453.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:53:13.636307.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:53:13.636307.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:53:13.636307.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T00_31_39.922453
path:
- '**/details_harness|winogrande|5_2023-09-17T00-31-39.922453.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T00-31-39.922453.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_53_13.636307
path:
- results_2023-08-29T20:53:13.636307.parquet
- split: 2023_09_17T00_31_39.922453
path:
- results_2023-09-17T00-31-39.922453.parquet
- split: latest
path:
- results_2023-09-17T00-31-39.922453.parquet
---
# Dataset Card for Evaluation run of Sao10K/Mythical-Destroyer-V2-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Mythical-Destroyer-V2-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Mythical-Destroyer-V2-L2-13B](https://huggingface.co/Sao10K/Mythical-Destroyer-V2-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-V2-L2-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T00:31:39.922453](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Mythical-Destroyer-V2-L2-13B/blob/main/results_2023-09-17T00-31-39.922453.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004089765100671141,
"em_stderr": 0.0006535802669912855,
"f1": 0.10332739093959775,
"f1_stderr": 0.00186450066100098,
"acc": 0.3737174427782163,
"acc_stderr": 0.0061055742246970525
},
"harness|drop|3": {
"em": 0.004089765100671141,
"em_stderr": 0.0006535802669912855,
"f1": 0.10332739093959775,
"f1_stderr": 0.00186450066100098
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ | 2023-08-29T20:56:28.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ](https://huggingface.co/TheBloke/Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T20:55:05.081055](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ/blob/main/results_2023-08-29T20%3A55%3A05.081055.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4034689621810347,\n\
\ \"acc_stderr\": 0.03467877900913612,\n \"acc_norm\": 0.4072485843837375,\n\
\ \"acc_norm_stderr\": 0.03466353815247821,\n \"mc1\": 0.35006119951040393,\n\
\ \"mc1_stderr\": 0.016697949420151032,\n \"mc2\": 0.5254779040118085,\n\
\ \"mc2_stderr\": 0.01594442535756773\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5051194539249146,\n \"acc_stderr\": 0.014610624890309157,\n\
\ \"acc_norm\": 0.5281569965870307,\n \"acc_norm_stderr\": 0.014588204105102203\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5962955586536547,\n\
\ \"acc_stderr\": 0.00489636818576524,\n \"acc_norm\": 0.7962557259510058,\n\
\ \"acc_norm_stderr\": 0.004019578428155064\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4226415094339623,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.4226415094339623,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.4305555555555556,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.03097669299853442,\n\
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.03097669299853442\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278006,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278006\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.33793103448275863,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.33793103448275863,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.42258064516129035,\n \"acc_stderr\": 0.02810096472427264,\n \"\
acc_norm\": 0.42258064516129035,\n \"acc_norm_stderr\": 0.02810096472427264\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.30049261083743845,\n \"acc_stderr\": 0.032257994762334846,\n \"\
acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.032257994762334846\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.494949494949495,\n \"acc_stderr\": 0.035621707606254015,\n \"\
acc_norm\": 0.494949494949495,\n \"acc_norm_stderr\": 0.035621707606254015\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5751295336787565,\n \"acc_stderr\": 0.035674713352125395,\n\
\ \"acc_norm\": 0.5751295336787565,\n \"acc_norm_stderr\": 0.035674713352125395\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3871794871794872,\n \"acc_stderr\": 0.02469721693087894,\n \
\ \"acc_norm\": 0.3871794871794872,\n \"acc_norm_stderr\": 0.02469721693087894\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.03120469122515002,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.03479185572599661,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.03479185572599661\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5302752293577981,\n \"acc_stderr\": 0.021397988604936965,\n \"\
acc_norm\": 0.5302752293577981,\n \"acc_norm_stderr\": 0.021397988604936965\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5611814345991561,\n \"acc_stderr\": 0.032302649315470375,\n \
\ \"acc_norm\": 0.5611814345991561,\n \"acc_norm_stderr\": 0.032302649315470375\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n\
\ \"acc_stderr\": 0.033460150119732274,\n \"acc_norm\": 0.5381165919282511,\n\
\ \"acc_norm_stderr\": 0.033460150119732274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4580152671755725,\n \"acc_stderr\": 0.04369802690578757,\n\
\ \"acc_norm\": 0.4580152671755725,\n \"acc_norm_stderr\": 0.04369802690578757\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.48760330578512395,\n \"acc_stderr\": 0.04562951548180765,\n \"\
acc_norm\": 0.48760330578512395,\n \"acc_norm_stderr\": 0.04562951548180765\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.44785276073619634,\n \"acc_stderr\": 0.03906947479456602,\n\
\ \"acc_norm\": 0.44785276073619634,\n \"acc_norm_stderr\": 0.03906947479456602\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5708812260536399,\n\
\ \"acc_stderr\": 0.01769938848312679,\n \"acc_norm\": 0.5708812260536399,\n\
\ \"acc_norm_stderr\": 0.01769938848312679\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.026680134761679217,\n\
\ \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.026680134761679217\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369922,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369922\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.028074158947600653,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.028074158947600653\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.45980707395498394,\n\
\ \"acc_stderr\": 0.028306190403305693,\n \"acc_norm\": 0.45980707395498394,\n\
\ \"acc_norm_stderr\": 0.028306190403305693\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4660493827160494,\n \"acc_stderr\": 0.02775653525734767,\n\
\ \"acc_norm\": 0.4660493827160494,\n \"acc_norm_stderr\": 0.02775653525734767\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.32269503546099293,\n \"acc_stderr\": 0.02788913930053479,\n \
\ \"acc_norm\": 0.32269503546099293,\n \"acc_norm_stderr\": 0.02788913930053479\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35658409387222945,\n\
\ \"acc_stderr\": 0.012233642989273891,\n \"acc_norm\": 0.35658409387222945,\n\
\ \"acc_norm_stderr\": 0.012233642989273891\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.02967428828131118,\n\
\ \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.02967428828131118\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4150326797385621,\n \"acc_stderr\": 0.019933627776857425,\n \
\ \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.019933627776857425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.44545454545454544,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.44545454545454544,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5024875621890548,\n\
\ \"acc_stderr\": 0.03535490150137289,\n \"acc_norm\": 0.5024875621890548,\n\
\ \"acc_norm_stderr\": 0.03535490150137289\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n\
\ \"acc_stderr\": 0.03696584317010601,\n \"acc_norm\": 0.3433734939759036,\n\
\ \"acc_norm_stderr\": 0.03696584317010601\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5497076023391813,\n \"acc_stderr\": 0.038158273659132366,\n\
\ \"acc_norm\": 0.5497076023391813,\n \"acc_norm_stderr\": 0.038158273659132366\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35006119951040393,\n\
\ \"mc1_stderr\": 0.016697949420151032,\n \"mc2\": 0.5254779040118085,\n\
\ \"mc2_stderr\": 0.01594442535756773\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:55:05.081055.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:55:05.081055.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:55:05.081055.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:55:05.081055.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_55_05.081055
path:
- results_2023-08-29T20:55:05.081055.parquet
- split: latest
path:
- results_2023-08-29T20:55:05.081055.parquet
---
# Dataset Card for Evaluation run of TheBloke/Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ](https://huggingface.co/TheBloke/Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-29T20:55:05.081055](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Manticore-13B-Chat-Pyg-Guanaco-SuperHOT-8K-GPTQ/blob/main/results_2023-08-29T20%3A55%3A05.081055.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4034689621810347,
"acc_stderr": 0.03467877900913612,
"acc_norm": 0.4072485843837375,
"acc_norm_stderr": 0.03466353815247821,
"mc1": 0.35006119951040393,
"mc1_stderr": 0.016697949420151032,
"mc2": 0.5254779040118085,
"mc2_stderr": 0.01594442535756773
},
"harness|arc:challenge|25": {
"acc": 0.5051194539249146,
"acc_stderr": 0.014610624890309157,
"acc_norm": 0.5281569965870307,
"acc_norm_stderr": 0.014588204105102203
},
"harness|hellaswag|10": {
"acc": 0.5962955586536547,
"acc_stderr": 0.00489636818576524,
"acc_norm": 0.7962557259510058,
"acc_norm_stderr": 0.004019578428155064
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4226415094339623,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.4226415094339623,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.03097669299853442,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.03097669299853442
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278006,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278006
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.33793103448275863,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.33793103448275863,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.42258064516129035,
"acc_stderr": 0.02810096472427264,
"acc_norm": 0.42258064516129035,
"acc_norm_stderr": 0.02810096472427264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.032257994762334846,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.032257994762334846
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.494949494949495,
"acc_stderr": 0.035621707606254015,
"acc_norm": 0.494949494949495,
"acc_norm_stderr": 0.035621707606254015
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5751295336787565,
"acc_stderr": 0.035674713352125395,
"acc_norm": 0.5751295336787565,
"acc_norm_stderr": 0.035674713352125395
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3871794871794872,
"acc_stderr": 0.02469721693087894,
"acc_norm": 0.3871794871794872,
"acc_norm_stderr": 0.02469721693087894
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.03479185572599661,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.03479185572599661
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5302752293577981,
"acc_stderr": 0.021397988604936965,
"acc_norm": 0.5302752293577981,
"acc_norm_stderr": 0.021397988604936965
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5611814345991561,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.5611814345991561,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.033460150119732274,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.033460150119732274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4580152671755725,
"acc_stderr": 0.04369802690578757,
"acc_norm": 0.4580152671755725,
"acc_norm_stderr": 0.04369802690578757
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.48760330578512395,
"acc_stderr": 0.04562951548180765,
"acc_norm": 0.48760330578512395,
"acc_norm_stderr": 0.04562951548180765
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44785276073619634,
"acc_stderr": 0.03906947479456602,
"acc_norm": 0.44785276073619634,
"acc_norm_stderr": 0.03906947479456602
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5708812260536399,
"acc_stderr": 0.01769938848312679,
"acc_norm": 0.5708812260536399,
"acc_norm_stderr": 0.01769938848312679
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.026680134761679217,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.026680134761679217
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369922,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369922
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.028074158947600653,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.028074158947600653
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.45980707395498394,
"acc_stderr": 0.028306190403305693,
"acc_norm": 0.45980707395498394,
"acc_norm_stderr": 0.028306190403305693
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4660493827160494,
"acc_stderr": 0.02775653525734767,
"acc_norm": 0.4660493827160494,
"acc_norm_stderr": 0.02775653525734767
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32269503546099293,
"acc_stderr": 0.02788913930053479,
"acc_norm": 0.32269503546099293,
"acc_norm_stderr": 0.02788913930053479
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35658409387222945,
"acc_stderr": 0.012233642989273891,
"acc_norm": 0.35658409387222945,
"acc_norm_stderr": 0.012233642989273891
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39338235294117646,
"acc_stderr": 0.02967428828131118,
"acc_norm": 0.39338235294117646,
"acc_norm_stderr": 0.02967428828131118
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4150326797385621,
"acc_stderr": 0.019933627776857425,
"acc_norm": 0.4150326797385621,
"acc_norm_stderr": 0.019933627776857425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.44545454545454544,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.44545454545454544,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5024875621890548,
"acc_stderr": 0.03535490150137289,
"acc_norm": 0.5024875621890548,
"acc_norm_stderr": 0.03535490150137289
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.03696584317010601,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.03696584317010601
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5497076023391813,
"acc_stderr": 0.038158273659132366,
"acc_norm": 0.5497076023391813,
"acc_norm_stderr": 0.038158273659132366
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35006119951040393,
"mc1_stderr": 0.016697949420151032,
"mc2": 0.5254779040118085,
"mc2_stderr": 0.01594442535756773
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
huggingface-course/audio-course-bark-samples | 2023-08-29T21:30:07.000Z | [
"region:us"
] | huggingface-course | null | null | null | 0 | 0 | Entry not found |
valhalla/sketch-images | 2023-08-29T21:26:54.000Z | [
"region:us"
] | valhalla | null | null | null | 0 | 0 | Entry not found |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.