| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
sooyeon/autotrain-data-flan-t5-large-financial-phrasebank-lora | 2023-10-09T13:51:35.000Z | [
"region:us"
] | sooyeon | null | null | null | 0 | 0 | Entry not found |
jamsonE/myself | 2023-10-09T13:52:53.000Z | [
"region:us"
] | jamsonE | null | null | null | 0 | 0 | Entry not found |
harinarayan/my_tiny_dataset | 2023-10-09T13:59:03.000Z | [
"region:us"
] | harinarayan | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 445121.0
num_examples: 8
download_size: 0
dataset_size: 445121.0
---
# Dataset Card for "my_tiny_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jamsonE/doc | 2023-10-09T13:54:50.000Z | [
"region:us"
] | jamsonE | null | null | null | 0 | 0 | Entry not found |
result-muse256-muse512-wuerst-sdv15/18cadc88 | 2023-10-09T14:02:51.000Z | [
"region:us"
] | result-muse256-muse512-wuerst-sdv15 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 191
num_examples: 10
download_size: 1352
dataset_size: 191
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "18cadc88"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
04RR/tiny-instruct | 2023-10-09T17:09:18.000Z | [
"task_categories:text-generation",
"size_categories:1M<n<10M",
"language:en",
"license:apache-2.0",
"region:us"
] | 04RR | null | null | null | 9 | 0 | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
size_categories:
- 1M<n<10M
pretty_name: tiny-instruct
---
# tiny-instruct-v1
This dataset is collated from multiple other open-source datasets. It contains a total of 8.5M rows, each with an instruction and a response.
#### Code Datasets:
1. [CodeAlpaca_20K](https://huggingface.co/datasets/HuggingFaceH4/CodeAlpaca_20K)
2. [CodeExercise-Python-27k](https://huggingface.co/datasets/codefuse-ai/CodeExercise-Python-27k)
3. [Evol-Instruct-Code-80k-v1](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1)
4. [tiny-codes](https://huggingface.co/datasets/nampdn-ai/tiny-codes)
5. [Evol-instruction-66k](https://huggingface.co/datasets/codefuse-ai/Evol-instruction-66k)
6. [sciphi-python-textbook](https://huggingface.co/datasets/emrgnt-cmplxty/sciphi-python-textbook)
7. [programming_books_llama](https://huggingface.co/datasets/open-phi/programming_books_llama)
8. [WizardLM_evol_instruct_70k](https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_70k)
#### Math Datasets:
1. [MetaMathQA](https://huggingface.co/datasets/meta-math/MetaMathQA)
2. [arxiv-math-instruct-50k](https://huggingface.co/datasets/ArtifactAI/arxiv-math-instruct-50k)
3. [MathInstruct](https://huggingface.co/datasets/TIGER-Lab/MathInstruct)
#### General Datasets:
1. [OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca)
2. [claude_evol_instruct_210k](https://huggingface.co/datasets/Norquinal/claude_evol_instruct_210k) |
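The collation described in the tiny-instruct card above can be sketched in plain Python: rows from several named sources are merged into one instruction/response set, with each row tagged by origin. The source names and rows below are illustrative placeholders, not the actual data or the author's pipeline.

```python
# Hypothetical sketch of collating instruction/response rows from
# several source datasets into one list, tagging each row's origin.
# Source names and example rows are illustrative only.

def collate(sources):
    """Merge rows from multiple named sources into a single list."""
    merged = []
    for name, rows in sources.items():
        for row in rows:
            merged.append({
                "source": name,
                "instruction": row["instruction"],
                "response": row["response"],
            })
    return merged

sources = {
    "CodeAlpaca_20K": [
        {"instruction": "Reverse a list in Python.", "response": "lst[::-1]"},
    ],
    "MetaMathQA": [
        {"instruction": "What is 2 + 2?", "response": "4"},
    ],
}

rows = collate(sources)
print(len(rows))  # 2
```

In practice each source would be streamed from the Hub and normalized to the shared two-column schema before merging; the sketch only shows the merge step.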
result-muse256-muse512-wuerst-sdv15/ef1e42ce | 2023-10-09T14:07:23.000Z | [
"region:us"
] | result-muse256-muse512-wuerst-sdv15 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 202
num_examples: 10
download_size: 1381
dataset_size: 202
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ef1e42ce"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CATTAC/extended_cuboid | 2023-10-09T14:29:44.000Z | [
"license:apache-2.0",
"region:us"
] | CATTAC | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
YSFF/my_loras | 2023-10-09T14:46:28.000Z | [
"region:us"
] | YSFF | null | null | null | 0 | 0 | Entry not found |
Tsuinzues/wellington-lima | 2023-10-09T14:50:23.000Z | [
"license:openrail",
"region:us"
] | Tsuinzues | null | null | null | 0 | 0 | ---
license: openrail
---
|
W1lson/testt2 | 2023-10-09T14:58:09.000Z | [
"region:us"
] | W1lson | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: Category
dtype: string
- name: Description
dtype: string
splits:
- name: train
num_bytes: 383
num_examples: 5
download_size: 1879
dataset_size: 383
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "testt2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RorooroR/BossaNova | 2023-10-09T17:13:19.000Z | [
"region:us"
] | RorooroR | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: audio_file
dtype: string
- name: slice
dtype: int16
splits:
- name: train
num_bytes: 1147109518.125
num_examples: 27791
download_size: 1143310714
dataset_size: 1147109518.125
---
# Dataset Card for "BossaNova"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andreabac3/mmlu_ita | 2023-10-10T15:22:33.000Z | [
"region:us"
] | andreabac3 | null | null | null | 0 | 0 | ---
dataset_info:
- config_name: abstract_algebra
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 23200
num_examples: 100
- name: validation
num_bytes: 2338
num_examples: 11
- name: dev
num_bytes: 997
num_examples: 5
download_size: 0
dataset_size: 26535
- config_name: anatomy
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 35503
num_examples: 135
- name: validation
num_bytes: 3542
num_examples: 14
- name: dev
num_bytes: 1068
num_examples: 5
download_size: 0
dataset_size: 40113
- config_name: astronomy
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 50897
num_examples: 152
- name: validation
num_bytes: 5753
num_examples: 16
- name: dev
num_bytes: 2361
num_examples: 5
download_size: 0
dataset_size: 59011
- config_name: business_ethics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 51712
num_examples: 100
- name: validation
num_bytes: 8126
num_examples: 11
- name: dev
num_bytes: 3667
num_examples: 5
download_size: 0
dataset_size: 63505
- config_name: clinical_knowledge
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 76158
num_examples: 265
- name: validation
num_bytes: 8010
num_examples: 29
- name: dev
num_bytes: 1405
num_examples: 5
download_size: 0
dataset_size: 85573
- config_name: college_biology
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 52282
num_examples: 144
- name: validation
num_bytes: 5297
num_examples: 16
- name: dev
num_bytes: 1673
num_examples: 5
download_size: 0
dataset_size: 59252
- config_name: college_chemistry
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 27954
num_examples: 100
- name: validation
num_bytes: 2377
num_examples: 8
- name: dev
num_bytes: 1396
num_examples: 5
download_size: 0
dataset_size: 31727
- config_name: college_computer_science
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 44978
num_examples: 100
- name: validation
num_bytes: 5140
num_examples: 11
- name: dev
num_bytes: 2793
num_examples: 5
download_size: 0
dataset_size: 52911
- config_name: college_mathematics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 26688
num_examples: 100
- name: validation
num_bytes: 2808
num_examples: 11
- name: dev
num_bytes: 1702
num_examples: 5
download_size: 0
dataset_size: 31198
- config_name: college_medicine
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 72529
num_examples: 173
- name: validation
num_bytes: 7584
num_examples: 22
- name: dev
num_bytes: 1972
num_examples: 5
download_size: 0
dataset_size: 82085
- config_name: college_physics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 30955
num_examples: 102
- name: validation
num_bytes: 3328
num_examples: 11
- name: dev
num_bytes: 1171
num_examples: 5
download_size: 0
dataset_size: 35454
- config_name: computer_security
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 32036
num_examples: 100
- name: validation
num_bytes: 4426
num_examples: 11
- name: dev
num_bytes: 1348
num_examples: 5
download_size: 0
dataset_size: 37810
- config_name: conceptual_physics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 49517
num_examples: 235
- name: validation
num_bytes: 5335
num_examples: 26
- name: dev
num_bytes: 1019
num_examples: 5
download_size: 0
dataset_size: 55871
- config_name: econometrics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 50915
num_examples: 114
- name: validation
num_bytes: 5322
num_examples: 12
- name: dev
num_bytes: 1829
num_examples: 5
download_size: 0
dataset_size: 58066
- config_name: electrical_engineering
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 31418
num_examples: 145
- name: validation
num_bytes: 3546
num_examples: 16
- name: dev
num_bytes: 1107
num_examples: 5
download_size: 0
dataset_size: 36071
- config_name: elementary_mathematics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 77471
num_examples: 378
- name: validation
num_bytes: 10370
num_examples: 41
- name: dev
num_bytes: 1488
num_examples: 5
download_size: 0
dataset_size: 89329
- config_name: formal_logic
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 54923
num_examples: 126
- name: validation
num_bytes: 6116
num_examples: 14
- name: dev
num_bytes: 1747
num_examples: 5
download_size: 0
dataset_size: 62786
- config_name: global_facts
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 22321
num_examples: 100
- name: validation
num_bytes: 2089
num_examples: 10
- name: dev
num_bytes: 1355
num_examples: 5
download_size: 0
dataset_size: 25765
- config_name: high_school_biology
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 116214
num_examples: 310
- name: validation
num_bytes: 11726
num_examples: 32
- name: dev
num_bytes: 1755
num_examples: 5
download_size: 0
dataset_size: 129695
- config_name: high_school_chemistry
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 63398
num_examples: 203
- name: validation
num_bytes: 7727
num_examples: 22
- name: dev
num_bytes: 1198
num_examples: 5
download_size: 0
dataset_size: 72323
- config_name: high_school_computer_science
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 47681
num_examples: 100
- name: validation
num_bytes: 5105
num_examples: 9
- name: dev
num_bytes: 3080
num_examples: 5
download_size: 0
dataset_size: 55866
- config_name: high_school_european_history
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 195964
num_examples: 165
- name: validation
num_bytes: 20338
num_examples: 18
- name: dev
num_bytes: 9816
num_examples: 5
download_size: 0
dataset_size: 226118
- config_name: high_school_geography
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 51366
num_examples: 198
- name: validation
num_bytes: 5161
num_examples: 22
- name: dev
num_bytes: 1764
num_examples: 5
download_size: 0
dataset_size: 58291
- config_name: high_school_government_and_politics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 80631
num_examples: 193
- name: validation
num_bytes: 8631
num_examples: 21
- name: dev
num_bytes: 2143
num_examples: 5
download_size: 0
dataset_size: 91405
- config_name: high_school_macroeconomics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 139917
num_examples: 390
- name: validation
num_bytes: 15649
num_examples: 43
- name: dev
num_bytes: 1654
num_examples: 5
download_size: 0
dataset_size: 157220
- config_name: high_school_mathematics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 59753
num_examples: 270
- name: validation
num_bytes: 6463
num_examples: 29
- name: dev
num_bytes: 1344
num_examples: 5
download_size: 0
dataset_size: 67560
- config_name: high_school_microeconomics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 89562
num_examples: 238
- name: validation
num_bytes: 9374
num_examples: 26
- name: dev
num_bytes: 1541
num_examples: 5
download_size: 0
dataset_size: 100477
- config_name: high_school_physics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 61509
num_examples: 151
- name: validation
num_bytes: 6857
num_examples: 17
- name: dev
num_bytes: 1439
num_examples: 5
download_size: 0
dataset_size: 69805
- config_name: high_school_psychology
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 174477
num_examples: 545
- name: validation
num_bytes: 18714
num_examples: 60
- name: dev
num_bytes: 1947
num_examples: 5
download_size: 5339
dataset_size: 195138
- config_name: high_school_statistics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 114353
num_examples: 216
- name: validation
num_bytes: 10677
num_examples: 23
- name: dev
num_bytes: 2483
num_examples: 5
download_size: 77040
dataset_size: 127513
- config_name: high_school_us_history
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 247296
num_examples: 204
- name: validation
num_bytes: 26784
num_examples: 22
- name: dev
num_bytes: 7808
num_examples: 5
download_size: 152191
dataset_size: 281888
- config_name: high_school_world_history
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: modified
dtype: bool
splits:
- name: test
num_bytes: 270559
num_examples: 237
- name: validation
num_bytes: 30524
num_examples: 26
- name: dev
num_bytes: 4152
num_examples: 5
download_size: 151829
dataset_size: 305235
- config_name: human_aging
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 54554
num_examples: 223
- name: validation
num_bytes: 5658
num_examples: 23
- name: dev
num_bytes: 1156
num_examples: 5
download_size: 44359
dataset_size: 61368
- config_name: human_sexuality
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 35263
num_examples: 131
- name: validation
num_bytes: 2744
num_examples: 12
- name: dev
num_bytes: 1246
num_examples: 5
download_size: 33418
dataset_size: 39253
- config_name: international_law
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 62507
num_examples: 121
- name: validation
num_bytes: 7326
num_examples: 13
- name: dev
num_bytes: 2759
num_examples: 5
download_size: 46021
dataset_size: 72592
- config_name: jurisprudence
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 38263
num_examples: 108
- name: validation
num_bytes: 4065
num_examples: 11
- name: dev
num_bytes: 1467
num_examples: 5
download_size: 34856
dataset_size: 43795
- config_name: logical_fallacies
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 59713
num_examples: 163
- name: validation
num_bytes: 5754
num_examples: 18
- name: dev
num_bytes: 1769
num_examples: 5
download_size: 36532
dataset_size: 67236
- config_name: machine_learning
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 38897
num_examples: 112
- name: validation
num_bytes: 3849
num_examples: 11
- name: dev
num_bytes: 2398
num_examples: 5
download_size: 32431
dataset_size: 45144
- config_name: management
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 23563
num_examples: 103
- name: validation
num_bytes: 2121
num_examples: 11
- name: dev
num_bytes: 1031
num_examples: 5
download_size: 23715
dataset_size: 26715
- config_name: marketing
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 70232
num_examples: 234
- name: validation
num_bytes: 8114
num_examples: 25
- name: dev
num_bytes: 1745
num_examples: 5
download_size: 52169
dataset_size: 80091
- config_name: medical_genetics
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 24346
num_examples: 100
- name: validation
num_bytes: 3446
num_examples: 11
- name: dev
num_bytes: 1251
num_examples: 5
download_size: 26431
dataset_size: 29043
- config_name: miscellaneous
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 168402
num_examples: 783
- name: validation
num_bytes: 16526
num_examples: 86
- name: dev
num_bytes: 830
num_examples: 5
download_size: 120284
dataset_size: 185758
- config_name: moral_disputes
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 121885
num_examples: 346
- name: validation
num_bytes: 13975
num_examples: 38
- name: dev
num_bytes: 1886
num_examples: 5
download_size: 82243
dataset_size: 137746
- config_name: moral_scenarios
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 462681
num_examples: 895
- name: validation
num_bytes: 52216
num_examples: 100
- name: dev
num_bytes: 2408
num_examples: 5
download_size: 124698
dataset_size: 517305
- config_name: nutrition
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 105476
num_examples: 306
- name: validation
num_bytes: 9873
num_examples: 33
- name: dev
num_bytes: 2315
num_examples: 5
download_size: 71858
dataset_size: 117664
- config_name: philosophy
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 88744
num_examples: 311
- name: validation
num_bytes: 10024
num_examples: 34
- name: dev
num_bytes: 1141
num_examples: 5
download_size: 63473
dataset_size: 99909
- config_name: prehistory
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 99975
num_examples: 324
- name: validation
num_bytes: 11132
num_examples: 35
- name: dev
num_bytes: 2066
num_examples: 5
download_size: 74487
dataset_size: 113173
- config_name: professional_accounting
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 132812
num_examples: 282
- name: validation
num_bytes: 16377
num_examples: 31
- name: dev
num_bytes: 2369
num_examples: 5
download_size: 90091
dataset_size: 151558
- config_name: professional_law
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 1520402
num_examples: 1534
- name: validation
num_bytes: 169655
num_examples: 170
- name: dev
num_bytes: 5440
num_examples: 5
download_size: 923751
dataset_size: 1695497
- config_name: professional_medicine
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 181007
num_examples: 272
- name: validation
num_bytes: 21418
num_examples: 31
- name: dev
num_bytes: 3163
num_examples: 5
download_size: 124049
dataset_size: 205588
- config_name: professional_psychology
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 249181
num_examples: 612
- name: validation
num_bytes: 31436
num_examples: 69
- name: dev
num_bytes: 2144
num_examples: 5
download_size: 162179
dataset_size: 282761
- config_name: public_relations
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 32432
num_examples: 110
- name: validation
num_bytes: 5102
num_examples: 12
- name: dev
num_bytes: 1722
num_examples: 5
download_size: 32499
dataset_size: 39256
- config_name: security_studies
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 191861
num_examples: 245
- name: validation
num_bytes: 22973
num_examples: 27
- name: dev
num_bytes: 4305
num_examples: 5
download_size: 126797
dataset_size: 219139
- config_name: sociology
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 74793
num_examples: 201
- name: validation
num_bytes: 8167
num_examples: 22
- name: dev
num_bytes: 1857
num_examples: 5
download_size: 60326
dataset_size: 84817
- config_name: us_foreign_policy
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 34225
num_examples: 100
- name: validation
num_bytes: 3797
num_examples: 11
- name: dev
num_bytes: 1981
num_examples: 5
download_size: 31778
dataset_size: 40003
- config_name: virology
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 44038
num_examples: 166
- name: validation
num_bytes: 5758
num_examples: 18
- name: dev
num_bytes: 1259
num_examples: 5
download_size: 41326
dataset_size: 51055
- config_name: world_religions
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 29376
num_examples: 171
- name: validation
num_bytes: 3242
num_examples: 19
- name: dev
num_bytes: 792
num_examples: 5
download_size: 28314
dataset_size: 33410
configs:
- config_name: abstract_algebra
data_files:
- split: test
path: abstract_algebra/test-*
- split: validation
path: abstract_algebra/validation-*
- split: dev
path: abstract_algebra/dev-*
- config_name: anatomy
data_files:
- split: test
path: anatomy/test-*
- split: validation
path: anatomy/validation-*
- split: dev
path: anatomy/dev-*
- config_name: astronomy
data_files:
- split: test
path: astronomy/test-*
- split: validation
path: astronomy/validation-*
- split: dev
path: astronomy/dev-*
- config_name: business_ethics
data_files:
- split: test
path: business_ethics/test-*
- split: validation
path: business_ethics/validation-*
- split: dev
path: business_ethics/dev-*
- config_name: clinical_knowledge
data_files:
- split: test
path: clinical_knowledge/test-*
- split: validation
path: clinical_knowledge/validation-*
- split: dev
path: clinical_knowledge/dev-*
- config_name: college_biology
data_files:
- split: test
path: college_biology/test-*
- split: validation
path: college_biology/validation-*
- split: dev
path: college_biology/dev-*
- config_name: college_chemistry
data_files:
- split: test
path: college_chemistry/test-*
- split: validation
path: college_chemistry/validation-*
- split: dev
path: college_chemistry/dev-*
- config_name: college_computer_science
data_files:
- split: test
path: college_computer_science/test-*
- split: validation
path: college_computer_science/validation-*
- split: dev
path: college_computer_science/dev-*
- config_name: college_mathematics
data_files:
- split: test
path: college_mathematics/test-*
- split: validation
path: college_mathematics/validation-*
- split: dev
path: college_mathematics/dev-*
- config_name: college_medicine
data_files:
- split: test
path: college_medicine/test-*
- split: validation
path: college_medicine/validation-*
- split: dev
path: college_medicine/dev-*
- config_name: college_physics
data_files:
- split: test
path: college_physics/test-*
- split: validation
path: college_physics/validation-*
- split: dev
path: college_physics/dev-*
- config_name: computer_security
data_files:
- split: test
path: computer_security/test-*
- split: validation
path: computer_security/validation-*
- split: dev
path: computer_security/dev-*
- config_name: conceptual_physics
data_files:
- split: test
path: conceptual_physics/test-*
- split: validation
path: conceptual_physics/validation-*
- split: dev
path: conceptual_physics/dev-*
- config_name: econometrics
data_files:
- split: test
path: econometrics/test-*
- split: validation
path: econometrics/validation-*
- split: dev
path: econometrics/dev-*
- config_name: electrical_engineering
data_files:
- split: test
path: electrical_engineering/test-*
- split: validation
path: electrical_engineering/validation-*
- split: dev
path: electrical_engineering/dev-*
- config_name: elementary_mathematics
data_files:
- split: test
path: elementary_mathematics/test-*
- split: validation
path: elementary_mathematics/validation-*
- split: dev
path: elementary_mathematics/dev-*
- config_name: formal_logic
data_files:
- split: test
path: formal_logic/test-*
- split: validation
path: formal_logic/validation-*
- split: dev
path: formal_logic/dev-*
- config_name: global_facts
data_files:
- split: test
path: global_facts/test-*
- split: validation
path: global_facts/validation-*
- split: dev
path: global_facts/dev-*
- config_name: high_school_biology
data_files:
- split: test
path: high_school_biology/test-*
- split: validation
path: high_school_biology/validation-*
- split: dev
path: high_school_biology/dev-*
- config_name: high_school_chemistry
data_files:
- split: test
path: high_school_chemistry/test-*
- split: validation
path: high_school_chemistry/validation-*
- split: dev
path: high_school_chemistry/dev-*
- config_name: high_school_computer_science
data_files:
- split: test
path: high_school_computer_science/test-*
- split: validation
path: high_school_computer_science/validation-*
- split: dev
path: high_school_computer_science/dev-*
- config_name: high_school_european_history
data_files:
- split: test
path: high_school_european_history/test-*
- split: validation
path: high_school_european_history/validation-*
- split: dev
path: high_school_european_history/dev-*
- config_name: high_school_geography
data_files:
- split: test
path: high_school_geography/test-*
- split: validation
path: high_school_geography/validation-*
- split: dev
path: high_school_geography/dev-*
- config_name: high_school_government_and_politics
data_files:
- split: test
path: high_school_government_and_politics/test-*
- split: validation
path: high_school_government_and_politics/validation-*
- split: dev
path: high_school_government_and_politics/dev-*
- config_name: high_school_macroeconomics
data_files:
- split: test
path: high_school_macroeconomics/test-*
- split: validation
path: high_school_macroeconomics/validation-*
- split: dev
path: high_school_macroeconomics/dev-*
- config_name: high_school_mathematics
data_files:
- split: test
path: high_school_mathematics/test-*
- split: validation
path: high_school_mathematics/validation-*
- split: dev
path: high_school_mathematics/dev-*
- config_name: high_school_microeconomics
data_files:
- split: test
path: high_school_microeconomics/test-*
- split: validation
path: high_school_microeconomics/validation-*
- split: dev
path: high_school_microeconomics/dev-*
- config_name: high_school_physics
data_files:
- split: test
path: high_school_physics/test-*
- split: validation
path: high_school_physics/validation-*
- split: dev
path: high_school_physics/dev-*
- config_name: high_school_psychology
data_files:
- split: test
path: high_school_psychology/test-*
- split: validation
path: high_school_psychology/validation-*
- split: dev
path: high_school_psychology/dev-*
- config_name: high_school_statistics
data_files:
- split: test
path: high_school_statistics/test-*
- split: validation
path: high_school_statistics/validation-*
- split: dev
path: high_school_statistics/dev-*
- config_name: high_school_us_history
data_files:
- split: test
path: high_school_us_history/test-*
- split: validation
path: high_school_us_history/validation-*
- split: dev
path: high_school_us_history/dev-*
- config_name: high_school_world_history
data_files:
- split: test
path: high_school_world_history/test-*
- split: validation
path: high_school_world_history/validation-*
- split: dev
path: high_school_world_history/dev-*
- config_name: human_aging
data_files:
- split: test
path: human_aging/test-*
- split: validation
path: human_aging/validation-*
- split: dev
path: human_aging/dev-*
- config_name: human_sexuality
data_files:
- split: test
path: human_sexuality/test-*
- split: validation
path: human_sexuality/validation-*
- split: dev
path: human_sexuality/dev-*
- config_name: international_law
data_files:
- split: test
path: international_law/test-*
- split: validation
path: international_law/validation-*
- split: dev
path: international_law/dev-*
- config_name: jurisprudence
data_files:
- split: test
path: jurisprudence/test-*
- split: validation
path: jurisprudence/validation-*
- split: dev
path: jurisprudence/dev-*
- config_name: logical_fallacies
data_files:
- split: test
path: logical_fallacies/test-*
- split: validation
path: logical_fallacies/validation-*
- split: dev
path: logical_fallacies/dev-*
- config_name: machine_learning
data_files:
- split: test
path: machine_learning/test-*
- split: validation
path: machine_learning/validation-*
- split: dev
path: machine_learning/dev-*
- config_name: management
data_files:
- split: test
path: management/test-*
- split: validation
path: management/validation-*
- split: dev
path: management/dev-*
- config_name: marketing
data_files:
- split: test
path: marketing/test-*
- split: validation
path: marketing/validation-*
- split: dev
path: marketing/dev-*
- config_name: medical_genetics
data_files:
- split: test
path: medical_genetics/test-*
- split: validation
path: medical_genetics/validation-*
- split: dev
path: medical_genetics/dev-*
- config_name: miscellaneous
data_files:
- split: test
path: miscellaneous/test-*
- split: validation
path: miscellaneous/validation-*
- split: dev
path: miscellaneous/dev-*
- config_name: moral_disputes
data_files:
- split: test
path: moral_disputes/test-*
- split: validation
path: moral_disputes/validation-*
- split: dev
path: moral_disputes/dev-*
- config_name: moral_scenarios
data_files:
- split: test
path: moral_scenarios/test-*
- split: validation
path: moral_scenarios/validation-*
- split: dev
path: moral_scenarios/dev-*
- config_name: nutrition
data_files:
- split: test
path: nutrition/test-*
- split: validation
path: nutrition/validation-*
- split: dev
path: nutrition/dev-*
- config_name: philosophy
data_files:
- split: test
path: philosophy/test-*
- split: validation
path: philosophy/validation-*
- split: dev
path: philosophy/dev-*
- config_name: prehistory
data_files:
- split: test
path: prehistory/test-*
- split: validation
path: prehistory/validation-*
- split: dev
path: prehistory/dev-*
- config_name: professional_accounting
data_files:
- split: test
path: professional_accounting/test-*
- split: validation
path: professional_accounting/validation-*
- split: dev
path: professional_accounting/dev-*
- config_name: professional_law
data_files:
- split: test
path: professional_law/test-*
- split: validation
path: professional_law/validation-*
- split: dev
path: professional_law/dev-*
- config_name: professional_medicine
data_files:
- split: test
path: professional_medicine/test-*
- split: validation
path: professional_medicine/validation-*
- split: dev
path: professional_medicine/dev-*
- config_name: professional_psychology
data_files:
- split: test
path: professional_psychology/test-*
- split: validation
path: professional_psychology/validation-*
- split: dev
path: professional_psychology/dev-*
- config_name: public_relations
data_files:
- split: test
path: public_relations/test-*
- split: validation
path: public_relations/validation-*
- split: dev
path: public_relations/dev-*
- config_name: security_studies
data_files:
- split: test
path: security_studies/test-*
- split: validation
path: security_studies/validation-*
- split: dev
path: security_studies/dev-*
- config_name: sociology
data_files:
- split: test
path: sociology/test-*
- split: validation
path: sociology/validation-*
- split: dev
path: sociology/dev-*
- config_name: us_foreign_policy
data_files:
- split: test
path: us_foreign_policy/test-*
- split: validation
path: us_foreign_policy/validation-*
- split: dev
path: us_foreign_policy/dev-*
- config_name: virology
data_files:
- split: test
path: virology/test-*
- split: validation
path: virology/validation-*
- split: dev
path: virology/dev-*
- config_name: world_religions
data_files:
- split: test
path: world_religions/test-*
- split: validation
path: world_religions/validation-*
- split: dev
path: world_religions/dev-*
---
# Dataset Card for "mmlu_ita"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_7 | 2023-10-09T15:13:21.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 16979523.0
num_examples: 278
download_size: 16649608
dataset_size: 16979523.0
---
# Dataset Card for "xix3d_v3_cluster_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_9 | 2023-10-09T15:13:23.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 5083851.0
num_examples: 79
download_size: 5017845
dataset_size: 5083851.0
---
# Dataset Card for "xix3d_v3_cluster_9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_10 | 2023-10-09T15:13:26.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 21108296.0
num_examples: 241
download_size: 21027307
dataset_size: 21108296.0
---
# Dataset Card for "xix3d_v3_cluster_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_14 | 2023-10-09T15:13:29.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 7412728.0
num_examples: 94
download_size: 7249490
dataset_size: 7412728.0
---
# Dataset Card for "xix3d_v3_cluster_14"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_2 | 2023-10-09T15:13:32.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 28048910.0
num_examples: 143
download_size: 28045400
dataset_size: 28048910.0
---
# Dataset Card for "xix3d_v3_cluster_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_13 | 2023-10-09T15:13:38.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 11243448.0
num_examples: 171
download_size: 11177141
dataset_size: 11243448.0
---
# Dataset Card for "xix3d_v3_cluster_13"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_6 | 2023-10-09T15:13:40.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 10196292.0
num_examples: 189
download_size: 10168132
dataset_size: 10196292.0
---
# Dataset Card for "xix3d_v3_cluster_6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_5 | 2023-10-09T15:13:43.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 17689968.0
num_examples: 135
download_size: 17687686
dataset_size: 17689968.0
---
# Dataset Card for "xix3d_v3_cluster_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_4 | 2023-10-09T15:13:47.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22367249.0
num_examples: 256
download_size: 22156260
dataset_size: 22367249.0
---
# Dataset Card for "xix3d_v3_cluster_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_0 | 2023-10-09T15:13:49.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 9439559.0
num_examples: 62
download_size: 9439717
dataset_size: 9439559.0
---
# Dataset Card for "xix3d_v3_cluster_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_8 | 2023-10-09T15:13:52.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 14373428.0
num_examples: 180
download_size: 14288874
dataset_size: 14373428.0
---
# Dataset Card for "xix3d_v3_cluster_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_1 | 2023-10-09T15:13:56.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 16217299.0
num_examples: 115
download_size: 16207892
dataset_size: 16217299.0
---
# Dataset Card for "xix3d_v3_cluster_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_12 | 2023-10-09T15:13:58.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 8530186.0
num_examples: 121
download_size: 8461385
dataset_size: 8530186.0
---
# Dataset Card for "xix3d_v3_cluster_12"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NikitaO/xix3d_v3_cluster_11 | 2023-10-09T15:14:01.000Z | [
"region:us"
] | NikitaO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 11628474.0
num_examples: 119
download_size: 11615236
dataset_size: 11628474.0
---
# Dataset Card for "xix3d_v3_cluster_11"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Globaly/familias5k | 2023-10-09T15:17:31.000Z | [
"region:us"
] | Globaly | null | null | null | 0 | 0 | Entry not found |
selinerdem/test-german-orca | 2023-10-09T15:18:27.000Z | [
"region:us"
] | selinerdem | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jpiorko/genai_survey_dataset | 2023-10-09T15:18:57.000Z | [
"region:us"
] | jpiorko | null | null | null | 0 | 0 | Entry not found |
trappy/ditspsyditsduck | 2023-10-09T15:32:55.000Z | [
"region:us"
] | trappy | null | null | null | 0 | 0 | Entry not found |
Globaly/clases21k | 2023-10-09T15:49:09.000Z | [
"region:us"
] | Globaly | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_budecosystem__boomer-1b | 2023-10-09T15:39:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of budecosystem/boomer-1b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [budecosystem/boomer-1b](https://huggingface.co/budecosystem/boomer-1b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_budecosystem__boomer-1b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T15:37:37.200624](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__boomer-1b/blob/main/results_2023-10-09T15-37-37.200624.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25620109849009637,\n\
\ \"acc_stderr\": 0.03154777205475439,\n \"acc_norm\": 0.25711859054322794,\n\
\ \"acc_norm_stderr\": 0.031560681096818845,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.39172921157056684,\n\
\ \"mc2_stderr\": 0.014887410849881546\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19539249146757678,\n \"acc_stderr\": 0.01158690718995291,\n\
\ \"acc_norm\": 0.22781569965870307,\n \"acc_norm_stderr\": 0.012256708602326909\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2940649273053177,\n\
\ \"acc_stderr\": 0.004546901132945138,\n \"acc_norm\": 0.3157737502489544,\n\
\ \"acc_norm_stderr\": 0.004638733202373881\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.036333844140734636,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.036333844140734636\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343604,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.025907897122408173,\n\
\ \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.025907897122408173\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.028957342788342347,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.028957342788342347\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068642,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068642\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3258064516129032,\n\
\ \"acc_stderr\": 0.0266620105785671,\n \"acc_norm\": 0.3258064516129032,\n\
\ \"acc_norm_stderr\": 0.0266620105785671\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3151515151515151,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.3151515151515151,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.31313131313131315,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3316062176165803,\n \"acc_stderr\": 0.03397636541089116,\n\
\ \"acc_norm\": 0.3316062176165803,\n \"acc_norm_stderr\": 0.03397636541089116\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.021763733684173923,\n\
\ \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.021763733684173923\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715477,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715477\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.29541284403669726,\n \"acc_stderr\": 0.019560619182976,\n \"acc_norm\"\
: 0.29541284403669726,\n \"acc_norm_stderr\": 0.019560619182976\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.03350991604696043,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.03350991604696043\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.031321798030832904,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18385650224215247,\n\
\ \"acc_stderr\": 0.02599837909235651,\n \"acc_norm\": 0.18385650224215247,\n\
\ \"acc_norm_stderr\": 0.02599837909235651\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847837,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847837\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591204,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591204\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.038946411200447915,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.038946411200447915\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21367521367521367,\n\
\ \"acc_stderr\": 0.026853450377009154,\n \"acc_norm\": 0.21367521367521367,\n\
\ \"acc_norm_stderr\": 0.026853450377009154\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n\
\ \"acc_stderr\": 0.01579430248788873,\n \"acc_norm\": 0.2656449553001277,\n\
\ \"acc_norm_stderr\": 0.01579430248788873\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574929,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574929\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.02405102973991225,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.02405102973991225\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.02600330111788513,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.02600330111788513\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2808641975308642,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.2808641975308642,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.02657786094330786,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.02657786094330786\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n\
\ \"acc_stderr\": 0.011015752255279338,\n \"acc_norm\": 0.2470664928292047,\n\
\ \"acc_norm_stderr\": 0.011015752255279338\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.029896163033125478,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.029896163033125478\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.01759348689536683,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.01759348689536683\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.02688214492230774,\n\
\ \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 0.02688214492230774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.18674698795180722,\n\
\ \"acc_stderr\": 0.030338749144500597,\n \"acc_norm\": 0.18674698795180722,\n\
\ \"acc_norm_stderr\": 0.030338749144500597\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.034240429246915824,\n\
\ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.034240429246915824\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.39172921157056684,\n\
\ \"mc2_stderr\": 0.014887410849881546\n }\n}\n```"
repo_url: https://huggingface.co/budecosystem/boomer-1b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-37-37.200624.parquet'
- config_name: results
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- results_2023-10-09T15-37-37.200624.parquet
- split: latest
path:
- results_2023-10-09T15-37-37.200624.parquet
---
# Dataset Card for Evaluation run of budecosystem/boomer-1b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/budecosystem/boomer-1b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [budecosystem/boomer-1b](https://huggingface.co/budecosystem/boomer-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
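Since the timestamped split names follow the pattern `2023_10_09T15_37_37.200624`, they sort lexicographically in chronological order, so the most recent run can be picked without parsing dates. A small helper sketch (not part of the `datasets` API):

```python
def newest_run(split_names):
    """Return the most recent timestamped split name, or 'latest' as a fallback.

    Split names like '2023_10_09T15_37_37.200624' are zero-padded, so
    lexicographic max() coincides with chronological order.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped) if timestamped else "latest"

print(newest_run(["2023_10_09T15_37_37.200624", "latest"]))
# → 2023_10_09T15_37_37.200624
```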
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_budecosystem__boomer-1b",
"harness_truthfulqa_mc_0",
        split="latest")
```
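The config name passed to `load_dataset` is a normalized form of the task key that appears in the results JSON below. A minimal sketch of that mapping (a hypothetical helper mirroring the naming pattern visible in this card's YAML `configs` section; it is not part of the `datasets` library):

```python
# Hypothetical helper: derive a config name such as "harness_truthfulqa_mc_0"
# from a results key such as "harness|truthfulqa:mc|0". This mirrors the
# naming pattern visible in the YAML `configs` section of this card; it is
# not an official `datasets` API.
def task_to_config(task_key: str) -> str:
    # The results keys use "|", ":" and "-" as separators; the config
    # names replace all of them with underscores.
    return task_key.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config("harness|truthfulqa:mc|0"))           # harness_truthfulqa_mc_0
print(task_to_config("harness|hendrycksTest-virology|5"))  # harness_hendrycksTest_virology_5
```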
## Latest results
These are the [latest results from run 2023-10-09T15:37:37.200624](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__boomer-1b/blob/main/results_2023-10-09T15-37-37.200624.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25620109849009637,
"acc_stderr": 0.03154777205475439,
"acc_norm": 0.25711859054322794,
"acc_norm_stderr": 0.031560681096818845,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.39172921157056684,
"mc2_stderr": 0.014887410849881546
},
"harness|arc:challenge|25": {
"acc": 0.19539249146757678,
"acc_stderr": 0.01158690718995291,
"acc_norm": 0.22781569965870307,
"acc_norm_stderr": 0.012256708602326909
},
"harness|hellaswag|10": {
"acc": 0.2940649273053177,
"acc_stderr": 0.004546901132945138,
"acc_norm": 0.3157737502489544,
"acc_norm_stderr": 0.004638733202373881
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.036333844140734636,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.036333844140734636
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.025907897122408173,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.025907897122408173
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.028957342788342347,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.028957342788342347
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068642,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3258064516129032,
"acc_stderr": 0.0266620105785671,
"acc_norm": 0.3258064516129032,
"acc_norm_stderr": 0.0266620105785671
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.0316185633535861,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.0316185633535861
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3151515151515151,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.3151515151515151,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.31313131313131315,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.31313131313131315,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3316062176165803,
"acc_stderr": 0.03397636541089116,
"acc_norm": 0.3316062176165803,
"acc_norm_stderr": 0.03397636541089116
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.021763733684173923,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.021763733684173923
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715477,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715477
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29541284403669726,
"acc_stderr": 0.019560619182976,
"acc_norm": 0.29541284403669726,
"acc_norm_stderr": 0.019560619182976
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.18385650224215247,
"acc_stderr": 0.02599837909235651,
"acc_norm": 0.18385650224215247,
"acc_norm_stderr": 0.02599837909235651
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847837,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847837
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591204,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591204
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.038946411200447915,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.038946411200447915
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21367521367521367,
"acc_stderr": 0.026853450377009154,
"acc_norm": 0.21367521367521367,
"acc_norm_stderr": 0.026853450377009154
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2656449553001277,
"acc_stderr": 0.01579430248788873,
"acc_norm": 0.2656449553001277,
"acc_norm_stderr": 0.01579430248788873
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22254335260115607,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.22254335260115607,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574929,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574929
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.02405102973991225,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.02405102973991225
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.02600330111788513,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.02600330111788513
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2808641975308642,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.2808641975308642,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.02657786094330786,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.02657786094330786
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279338,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279338
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.029896163033125478,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.029896163033125478
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22857142857142856,
"acc_stderr": 0.02688214492230774,
"acc_norm": 0.22857142857142856,
"acc_norm_stderr": 0.02688214492230774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.18674698795180722,
"acc_stderr": 0.030338749144500597,
"acc_norm": 0.18674698795180722,
"acc_norm_stderr": 0.030338749144500597
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.39172921157056684,
"mc2_stderr": 0.014887410849881546
}
}
```
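The top-level "all" block above is an unweighted average over the per-task scores. A minimal sketch of that aggregation, using a small hand-copied excerpt of the results (assumption: the harness averages accuracies with equal weight per task, so this excerpt's mean differs from the full-run aggregate):

```python
# Sketch of how the aggregate "acc" in the "all" block relates to the
# per-task accuracies: an unweighted mean over tasks (an assumption, as
# stated in the text above). Only a small excerpt of the results is used
# here, so the value differs from the full-run aggregate.
results_excerpt = {
    "harness|arc:challenge|25": 0.19539249146757678,
    "harness|hellaswag|10": 0.2940649273053177,
    "harness|hendrycksTest-abstract_algebra|5": 0.23,
    "harness|hendrycksTest-anatomy|5": 0.22962962962962963,
}

mean_acc = sum(results_excerpt.values()) / len(results_excerpt)
print(f"mean acc over {len(results_excerpt)} tasks: {mean_acc:.4f}")
```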
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_caisarl76__Mistral-7B-OpenOrca-Guanaco-accu16 | 2023-10-09T15:54:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16](https://huggingface.co/caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_caisarl76__Mistral-7B-OpenOrca-Guanaco-accu16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T15:53:10.944584](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__Mistral-7B-OpenOrca-Guanaco-accu16/blob/main/results_2023-10-09T15-53-10.944584.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6124462933283554,\n\
\ \"acc_stderr\": 0.03340417591620888,\n \"acc_norm\": 0.6163514685001366,\n\
\ \"acc_norm_stderr\": 0.03338338917324356,\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.508101159786776,\n\
\ \"mc2_stderr\": 0.015037778592388262\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256527,\n\
\ \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.014332236306790147\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6345349531965744,\n\
\ \"acc_stderr\": 0.0048057615138034126,\n \"acc_norm\": 0.8308105954989046,\n\
\ \"acc_norm_stderr\": 0.0037415289563158473\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851102,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851102\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239976,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239976\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630644\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647078,\n\
\ \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647078\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473075,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266875,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266875\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.03409386946992699,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.03409386946992699\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n\
\ \"acc_stderr\": 0.014711684386139953,\n \"acc_norm\": 0.7841634738186463,\n\
\ \"acc_norm_stderr\": 0.014711684386139953\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.02541600377316554,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.02541600377316554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n\
\ \"acc_stderr\": 0.014776765066438893,\n \"acc_norm\": 0.2659217877094972,\n\
\ \"acc_norm_stderr\": 0.014776765066438893\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n\
\ \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.02997280717046462,\n\
\ \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.02997280717046462\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6323529411764706,\n \"acc_stderr\": 0.019506291693954857,\n \
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.019506291693954857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.508101159786776,\n\
\ \"mc2_stderr\": 0.015037778592388262\n }\n}\n```"
repo_url: https://huggingface.co/caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-53-10.944584.parquet'
- config_name: results
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- results_2023-10-09T15-53-10.944584.parquet
- split: latest
path:
- results_2023-10-09T15-53-10.944584.parquet
---
# Dataset Card for Evaluation run of caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16](https://huggingface.co/caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_caisarl76__Mistral-7B-OpenOrca-Guanaco-accu16",
"harness_truthfulqa_mc_0",
	split="latest")
```
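The per-task configuration names follow a regular pattern: the harness task name with its separators (`:` and `-`) replaced by underscores, prefixed with `harness_` and suffixed with the few-shot count. A minimal sketch of that mapping (the helper name `leaderboard_config_name` is ours, not part of the `datasets` API):

```python
def leaderboard_config_name(task: str, num_fewshot: int) -> str:
    # Replace the harness task separators with underscores and append
    # the few-shot count, e.g. "truthfulqa:mc" with 0 shots becomes
    # "harness_truthfulqa_mc_0", and "hendrycksTest-abstract_algebra"
    # with 5 shots becomes "harness_hendrycksTest_abstract_algebra_5".
    sanitized = task.replace(":", "_").replace("-", "_")
    return f"harness_{sanitized}_{num_fewshot}"
```

This makes it easy to iterate over all 57 MMLU subtasks programmatically instead of copying each config name from the list above.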
## Latest results
These are the [latest results from run 2023-10-09T15:53:10.944584](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__Mistral-7B-OpenOrca-Guanaco-accu16/blob/main/results_2023-10-09T15-53-10.944584.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6124462933283554,
"acc_stderr": 0.03340417591620888,
"acc_norm": 0.6163514685001366,
"acc_norm_stderr": 0.03338338917324356,
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.508101159786776,
"mc2_stderr": 0.015037778592388262
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.014494421584256527,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.014332236306790147
},
"harness|hellaswag|10": {
"acc": 0.6345349531965744,
"acc_stderr": 0.0048057615138034126,
"acc_norm": 0.8308105954989046,
"acc_norm_stderr": 0.0037415289563158473
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851102,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851102
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239976,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239976
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630644,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647078,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647078
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473075,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266875,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266875
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.03409386946992699,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.03409386946992699
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.014711684386139953,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.014711684386139953
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.02541600377316554,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.02541600377316554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2659217877094972,
"acc_stderr": 0.014776765066438893,
"acc_norm": 0.2659217877094972,
"acc_norm_stderr": 0.014776765066438893
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.02628973494595293,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.02628973494595293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666904,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.02997280717046462,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.02997280717046462
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.019506291693954857,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.019506291693954857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.508101159786776,
"mc2_stderr": 0.015037778592388262
}
}
```
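The per-task entries above can be aggregated with a short script, e.g. to average `acc` over the MMLU (`hendrycksTest`) subtasks. This is a sketch: the dict below copies three entries from the results JSON for illustration, whereas a real script would `json.load()` the full results file.

```python
# Average "acc" across MMLU subtask entries (sample of the results shown above).
results = {
    "harness|hendrycksTest-econometrics|5": {"acc": 0.42105263157894735},
    "harness|hendrycksTest-electrical_engineering|5": {"acc": 0.5448275862068965},
    "harness|hendrycksTest-global_facts|5": {"acc": 0.33},
}

# MMLU subtasks are the keys prefixed with "harness|hendrycksTest-".
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]

# Unweighted mean over subtasks, as used for the aggregated MMLU score.
mean_acc = sum(results[k]["acc"] for k in mmlu_tasks) / len(mmlu_tasks)
```

Swapping `"acc"` for `"acc_norm"` gives the normalized-accuracy average in the same way.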
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_nicholasKluge__Aira-1B5 | 2023-10-09T15:55:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of nicholasKluge/Aira-1B5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nicholasKluge/Aira-1B5](https://huggingface.co/nicholasKluge/Aira-1B5) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-1B5\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T15:54:46.926141](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-1B5/blob/main/results_2023-10-09T15-54-46.926141.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2743564373642291,\n\
\ \"acc_stderr\": 0.03211959266297477,\n \"acc_norm\": 0.27587655832273894,\n\
\ \"acc_norm_stderr\": 0.0321270755179637,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.4115839931034755,\n\
\ \"mc2_stderr\": 0.015541548311642976\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2687713310580205,\n \"acc_stderr\": 0.01295506596371069,\n\
\ \"acc_norm\": 0.28924914675767915,\n \"acc_norm_stderr\": 0.013250012579393443\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.36188010356502687,\n\
\ \"acc_stderr\": 0.004795622757327151,\n \"acc_norm\": 0.43108942441744674,\n\
\ \"acc_norm_stderr\": 0.004942164585991465\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.037150621549989056,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.037150621549989056\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438665,\n\
\ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438665\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838728,\n\
\ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838728\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708604,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708604\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24193548387096775,\n\
\ \"acc_stderr\": 0.0243625996930311,\n \"acc_norm\": 0.24193548387096775,\n\
\ \"acc_norm_stderr\": 0.0243625996930311\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \
\ \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121633,\n\
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121633\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3394495412844037,\n \"acc_stderr\": 0.02030210934266235,\n \"\
acc_norm\": 0.3394495412844037,\n \"acc_norm_stderr\": 0.02030210934266235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.21518987341772153,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.21518987341772153,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13901345291479822,\n\
\ \"acc_stderr\": 0.023219352834474464,\n \"acc_norm\": 0.13901345291479822,\n\
\ \"acc_norm_stderr\": 0.023219352834474464\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"\
acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952688,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952688\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21367521367521367,\n\
\ \"acc_stderr\": 0.026853450377009154,\n \"acc_norm\": 0.21367521367521367,\n\
\ \"acc_norm_stderr\": 0.026853450377009154\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20561941251596424,\n\
\ \"acc_stderr\": 0.014452500456785825,\n \"acc_norm\": 0.20561941251596424,\n\
\ \"acc_norm_stderr\": 0.014452500456785825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967547,\n\
\ \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.022497230190967547\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537762,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537762\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23468057366362452,\n\
\ \"acc_stderr\": 0.010824026872449358,\n \"acc_norm\": 0.23468057366362452,\n\
\ \"acc_norm_stderr\": 0.010824026872449358\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877757,\n\
\ \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877757\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23202614379084968,\n \"acc_stderr\": 0.01707737337785701,\n \
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.01707737337785701\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.2736318407960199,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n\
\ \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n\
\ \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n\
\ \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.4115839931034755,\n\
\ \"mc2_stderr\": 0.015541548311642976\n }\n}\n```"
repo_url: https://huggingface.co/nicholasKluge/Aira-1B5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-54-46.926141.parquet'
- config_name: results
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- results_2023-10-09T15-54-46.926141.parquet
- split: latest
path:
- results_2023-10-09T15-54-46.926141.parquet
---
# Dataset Card for Evaluation run of nicholasKluge/Aira-1B5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-1B5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-1B5](https://huggingface.co/nicholasKluge/Aira-1B5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-1B5",
"harness_truthfulqa_mc_0",
	split="latest")
```
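As the YAML above shows, each 5-shot MMLU subtask gets its own configuration whose name follows a fixed pattern. A small helper (hypothetical, not part of the `datasets` API) can build these names programmatically, for instance to iterate over all subtasks:

```python
def mmlu_config_name(task: str) -> str:
    """Build the configuration name for a 5-shot hendrycksTest subtask,
    following the naming pattern in the YAML above."""
    return f"harness_hendrycksTest_{task}_5"

# e.g. the abstract algebra subtask
print(mmlu_config_name("abstract_algebra"))  # harness_hendrycksTest_abstract_algebra_5
```

The other harness tasks follow analogous patterns, e.g. `harness_truthfulqa_mc_0` for 0-shot TruthfulQA.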
## Latest results
These are the [latest results from run 2023-10-09T15:54:46.926141](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-1B5/blob/main/results_2023-10-09T15-54-46.926141.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.2743564373642291,
"acc_stderr": 0.03211959266297477,
"acc_norm": 0.27587655832273894,
"acc_norm_stderr": 0.0321270755179637,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.4115839931034755,
"mc2_stderr": 0.015541548311642976
},
"harness|arc:challenge|25": {
"acc": 0.2687713310580205,
"acc_stderr": 0.01295506596371069,
"acc_norm": 0.28924914675767915,
"acc_norm_stderr": 0.013250012579393443
},
"harness|hellaswag|10": {
"acc": 0.36188010356502687,
"acc_stderr": 0.004795622757327151,
"acc_norm": 0.43108942441744674,
"acc_norm_stderr": 0.004942164585991465
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.037150621549989056,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.037150621549989056
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838728,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838728
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708604,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708604
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.0243625996930311,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.0243625996930311
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121633,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121633
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3394495412844037,
"acc_stderr": 0.02030210934266235,
"acc_norm": 0.3394495412844037,
"acc_norm_stderr": 0.02030210934266235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.21518987341772153,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.21518987341772153,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13901345291479822,
"acc_stderr": 0.023219352834474464,
"acc_norm": 0.13901345291479822,
"acc_norm_stderr": 0.023219352834474464
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952688,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952688
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21367521367521367,
"acc_stderr": 0.026853450377009154,
"acc_norm": 0.21367521367521367,
"acc_norm_stderr": 0.026853450377009154
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20561941251596424,
"acc_stderr": 0.014452500456785825,
"acc_norm": 0.20561941251596424,
"acc_norm_stderr": 0.014452500456785825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967547,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.022497230190967547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537762,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537762
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23468057366362452,
"acc_stderr": 0.010824026872449358,
"acc_norm": 0.23468057366362452,
"acc_norm_stderr": 0.010824026872449358
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877757,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877757
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.01707737337785701,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.01707737337785701
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.4115839931034755,
"mc2_stderr": 0.015541548311642976
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2 | 2023-10-09T15:59:19.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of caisarl76/Mistral-7B-guanaco1k-ep2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [caisarl76/Mistral-7B-guanaco1k-ep2](https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T15:57:53.203212](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2/blob/main/results_2023-10-09T15-57-53.203212.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6143042166493967,\n\
\ \"acc_stderr\": 0.03338971893001228,\n \"acc_norm\": 0.618336439639903,\n\
\ \"acc_norm_stderr\": 0.03336872879395508,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172011,\n \"mc2\": 0.543989419660622,\n\
\ \"mc2_stderr\": 0.014697499688148498\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256515,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946705\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6272654849631547,\n\
\ \"acc_stderr\": 0.004825441080261185,\n \"acc_norm\": 0.8276239792869946,\n\
\ \"acc_norm_stderr\": 0.0037693500791958923\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518027,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518027\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915331,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915331\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630783,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630783\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615771,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7943805874840357,\n\
\ \"acc_stderr\": 0.01445250045678583,\n \"acc_norm\": 0.7943805874840357,\n\
\ \"acc_norm_stderr\": 0.01445250045678583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803824,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803824\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n\
\ \"acc_stderr\": 0.01477676506643889,\n \"acc_norm\": 0.2659217877094972,\n\
\ \"acc_norm_stderr\": 0.01477676506643889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02492200116888633,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02492200116888633\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4282920469361147,\n\
\ \"acc_stderr\": 0.012638223880313168,\n \"acc_norm\": 0.4282920469361147,\n\
\ \"acc_norm_stderr\": 0.012638223880313168\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681397,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681397\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6339869281045751,\n \"acc_stderr\": 0.01948802574552967,\n \
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.01948802574552967\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712846,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172011,\n \"mc2\": 0.543989419660622,\n\
\ \"mc2_stderr\": 0.014697499688148498\n }\n}\n```"
repo_url: https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-57-53.203212.parquet'
- config_name: results
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- results_2023-10-09T15-57-53.203212.parquet
- split: latest
path:
- results_2023-10-09T15-57-53.203212.parquet
---
# Dataset Card for Evaluation run of caisarl76/Mistral-7B-guanaco1k-ep2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [caisarl76/Mistral-7B-guanaco1k-ep2](https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2",
"harness_truthfulqa_mc_0",
	split="latest")
```
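The per-task config names listed in this card's metadata follow a regular pattern, so a small helper (hypothetical, shown only for illustration) can build the config name for any MMLU subtask before passing it to `load_dataset`:

```python
def harness_config_name(task: str, num_fewshot: int = 5) -> str:
    """Build the config name for a hendrycksTest (MMLU) subtask,
    mirroring the names listed in this card's metadata,
    e.g. "harness_hendrycksTest_college_mathematics_5"."""
    return f"harness_hendrycksTest_{task}_{num_fewshot}"


# Example: harness_config_name("college_mathematics")
# -> "harness_hendrycksTest_college_mathematics_5"
```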
## Latest results
These are the [latest results from run 2023-10-09T15:57:53.203212](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2/blob/main/results_2023-10-09T15-57-53.203212.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6143042166493967,
"acc_stderr": 0.03338971893001228,
"acc_norm": 0.618336439639903,
"acc_norm_stderr": 0.03336872879395508,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172011,
"mc2": 0.543989419660622,
"mc2_stderr": 0.014697499688148498
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.014494421584256515,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.014312094557946705
},
"harness|hellaswag|10": {
"acc": 0.6272654849631547,
"acc_stderr": 0.004825441080261185,
"acc_norm": 0.8276239792869946,
"acc_norm_stderr": 0.0037693500791958923
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518027,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518027
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915331,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915331
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630783,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615771,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7943805874840357,
"acc_stderr": 0.01445250045678583,
"acc_norm": 0.7943805874840357,
"acc_norm_stderr": 0.01445250045678583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803824,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803824
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2659217877094972,
"acc_stderr": 0.01477676506643889,
"acc_norm": 0.2659217877094972,
"acc_norm_stderr": 0.01477676506643889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4282920469361147,
"acc_stderr": 0.012638223880313168,
"acc_norm": 0.4282920469361147,
"acc_norm_stderr": 0.012638223880313168
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681397,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681397
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.01948802574552967,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.01948802574552967
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712846,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172011,
"mc2": 0.543989419660622,
"mc2_stderr": 0.014697499688148498
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Autoceres/Agricorp | 2023-10-09T16:31:40.000Z | [
"region:us"
] | Autoceres | null | null | null | 0 | 0 | Agricorp Dataset
The AutoCeres dataset comprises a collection of images captured from various sources and cultivation locations. It encompasses the following crops:
- Corn
- Soybean
- Rice
- Onion
Each crop category is associated with a set of images, and for further analysis and segmentation tasks, masks corresponding to these crops are also included. This dataset serves as a valuable resource for the development and training of computer vision algorithms in the agricultural domain. |
s-lab/images | 2023-10-09T16:39:46.000Z | [
"region:us"
] | s-lab | null | null | null | 0 | 0 | Entry not found |
SaiGirish/ddpm-butterflies-128 | 2023-10-09T16:06:39.000Z | [
"region:us"
] | SaiGirish | null | null | null | 0 | 0 | Entry not found |
JennyZZZ/guanaco-llama2-1k | 2023-10-09T20:48:00.000Z | [
"region:us"
] | JennyZZZ | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15401731
num_examples: 9846
- name: test
num_bytes: 815439
num_examples: 518
download_size: 0
dataset_size: 16217170
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Longhui98/Graph_LLM | 2023-10-09T16:35:02.000Z | [
"license:apache-2.0",
"region:us"
] | Longhui98 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
jacobcdahlke/lora_test | 2023-10-09T16:35:55.000Z | [
"license:openrail",
"region:us"
] | jacobcdahlke | null | null | null | 0 | 0 | ---
license: openrail
---
|
Amarjitkr/med | 2023-10-09T17:11:43.000Z | [
"license:apache-2.0",
"region:us"
] | Amarjitkr | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
ostapeno/ds | 2023-10-09T16:54:04.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: subject
dtype: string
- name: response
dtype: string
- name: author_instr
dtype: string
- name: inst_index_for_context
dtype: 'null'
- name: author_response
dtype: string
- name: normalized_cumul_logprob_response
dtype: float64
splits:
- name: train
num_bytes: 324551830
num_examples: 78955
download_size: 94548965
dataset_size: 324551830
---
# Dataset Card for "ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bofuchen/meimei | 2023-10-09T16:53:19.000Z | [
"region:us"
] | bofuchen | null | null | null | 0 | 0 | Entry not found |
dreamproit/bill_text_us | 2023-10-09T17:02:16.000Z | [
"license:mit",
"region:us"
] | dreamproit | null | null | null | 0 | 0 | ---
license: mit
---
|
mariosasko/single_commit_large_dataset | 2023-10-09T17:38:54.000Z | [
"region:us"
] | mariosasko | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: bytes
dtype: binary
splits:
- name: train
num_bytes: 55400000000
num_examples: 50000000
download_size: 59017818453
dataset_size: 55400000000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SohamNale/Banking_Dataset_for_LLM_Finetuning | 2023-10-09T17:05:59.000Z | [
"region:us"
] | SohamNale | null | null | null | 0 | 0 | Entry not found |
wolfgang1717/DIP | 2023-10-09T17:06:21.000Z | [
"license:apache-2.0",
"region:us"
] | wolfgang1717 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
foureyednymph/1111 | 2023-10-09T17:13:15.000Z | [
"region:us"
] | foureyednymph | null | null | null | 0 | 0 | Entry not found |
sleepyboyeyes/Acoustic | 2023-10-09T20:53:58.000Z | [
"region:us"
] | sleepyboyeyes | null | null | null | 0 | 0 | Entry not found |
nguyenthanhdo/patent_v2 | 2023-10-09T17:30:09.000Z | [
"region:us"
] | nguyenthanhdo | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: lang
dtype: string
- name: prompt
dtype: string
- name: prompt_len
dtype: int64
- name: source
dtype: string
splits:
- name: train
num_bytes: 255183131
num_examples: 100488
download_size: 130363196
dataset_size: 255183131
---
# Dataset Card for "patent_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bjoernp/random_captions_mistral | 2023-10-11T01:21:59.000Z | [
"region:us"
] | bjoernp | null | null | null | 0 | 0 | Entry not found |
autoevaluate/autoeval-eval-acronym_identification-default-e52b53-94025145972 | 2023-10-09T17:39:37.000Z | [
"region:us"
] | autoevaluate | null | null | null | 0 | 0 | Entry not found |
peter-h-o-r-v/autocast-initiative | 2023-10-09T18:31:55.000Z | [
"license:artistic-2.0",
"art",
"sound",
"podcast",
"podcasting",
"region:us"
] | peter-h-o-r-v | null | null | null | 0 | 0 | ---
license: artistic-2.0
pretty_name: The Autocast Initiative
tags:
- art
- sound
- podcast
- podcasting
---
# The Autocast Initiative
This dataset archives, in real time, podcasts that identify with the principle of autocasting as their method for sharing audio files with an audience of subscribers.
All contributors are volunteers.
## The Principles of Autocasting
* The content is primarily not created.
* Neither the files nor the RSS feed is manipulated after publishing, other than to correct mistakes.
  * The "episode description" is the exception to the above. Use this field however you please.
* No method is to be considered "too low-effort" when it comes to generating audio files.
* Contributors whose content is protected by monetization are encouraged to commit scrambled content and provide means for unscrambling as they see fit.
  * Further monetization is encouraged.
* Get paid if you can.
## How to contribute
Create a folder for your autocast like so:
```
/archive/[Name of your feed]/
```
Do not substitute special characters (if possible).
In this folder, include your episodes as well as snapshots of your RSS feed at the time of publishing (if possible):
```
/archive/[Name of your feed]/[001].mp3 // or whichever format you use
/archive/[Name of your feed]/[001].xml
/archive/[Name of your feed]/[002].mp3 // ...
/archive/[Name of your feed]/[002].xml
...
/archive/[Name of your feed]/[00n].mp3 // ...
/archive/[Name of your feed]/[00n].xml
...
```
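As a sketch of the layout above, a small Python helper could stage an episode and its feed snapshot into the archive. This is only an illustration of the naming convention, not part of the project; the function name and paths are hypothetical:

```python
from pathlib import Path
import shutil

def stage_episode(archive_root: str, feed_name: str, index: int,
                  audio_path: str, rss_snapshot_path: str) -> None:
    """Copy an episode and its RSS snapshot into <archive_root>/<feed_name>/
    using zero-padded names (001.mp3, 001.xml, ...) as in the layout above."""
    feed_dir = Path(archive_root) / feed_name
    feed_dir.mkdir(parents=True, exist_ok=True)
    stem = f"{index:03d}"  # 001, 002, ... (three digits covers up to 999 episodes)
    # Keep the original audio extension, whichever format the feed uses.
    shutil.copy(audio_path, feed_dir / f"{stem}{Path(audio_path).suffix}")
    shutil.copy(rss_snapshot_path, feed_dir / f"{stem}.xml")
```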
If you intend to publish more than 1000 episodes in a single feed, figure it out (responsibly) |
nguyenthanhdo/patent_v2_merged | 2023-10-09T17:53:51.000Z | [
"region:us"
] | nguyenthanhdo | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: lang
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 118735189
num_examples: 100488
download_size: 66085340
dataset_size: 118735189
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "patent_v2_merged"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
byrushrafa/size_objetc_indoor | 2023-10-09T17:57:57.000Z | [
"license:mit",
"region:us"
] | byrushrafa | null | null | null | 0 | 0 | ---
license: mit
---
|
edoramtej/edoramtej_testing_01 | 2023-10-09T20:15:49.000Z | [
"size_categories:n<1K",
"region:us"
] | edoramtej | null | null | null | 0 | 0 | ---
pretty_name: testing_01
size_categories:
- n<1K
--- |
ostapeno/wiki_platypus_inverse_mmlu_icl5_cleaned | 2023-10-10T12:55:42.000Z | [
"license:apache-2.0",
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
license: apache-2.0
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: subject
dtype: string
- name: response
dtype: string
- name: author_instr
dtype: string
- name: inst_index_for_context
dtype: 'null'
- name: author_response
dtype: string
- name: normalized_cumul_logprob_response
dtype: float64
splits:
- name: formal_logic
num_bytes: 17350810.897726554
num_examples: 4221
- name: machine_learning
num_bytes: 22896000.165917296
num_examples: 5570
- name: global_facts
num_bytes: 22271190.107529607
num_examples: 5418
- name: abstract_algebra
num_bytes: 16845208.02153125
num_examples: 4098
- name: high_school_physics
num_bytes: 33747964.337914005
num_examples: 8210
- name: college_biology
num_bytes: 31372041.879045025
num_examples: 7632
- name: high_school_government_and_politics
num_bytes: 40279695.80355899
num_examples: 9799
- name: prehistory
num_bytes: 54666769.51643341
num_examples: 13299
- name: security_studies
num_bytes: 44772573.3944652
num_examples: 10892
- name: sociology
num_bytes: 40349575.87587866
num_examples: 9816
download_size: 96889177
dataset_size: 324551830.0
---
|
hmao/rule_gen_splunk | 2023-10-09T18:43:48.000Z | [
"region:us"
] | hmao | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: 'null'
- name: rule
dtype: 'null'
- name: software
dtype: 'null'
- name: configuration
dtype: 'null'
- name: description
dtype: 'null'
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 1376
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rule_gen_splunk"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sudeepshouche/custom_summarizer | 2023-10-09T18:47:32.000Z | [
"license:apache-2.0",
"region:us"
] | sudeepshouche | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA | 2023-10-09T18:55:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of v2ray/LLaMA-2-Jannie-70B-QLoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [v2ray/LLaMA-2-Jannie-70B-QLoRA](https://huggingface.co/v2ray/LLaMA-2-Jannie-70B-QLoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-09T18:55:45.725131](https://huggingface.co/datasets/open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA/blob/main/results_2023-10-09T18-55-45.725131.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.5506501677852349,\n\
\ \"em_stderr\": 0.0050941277409732805,\n \"f1\": 0.5974674916107394,\n\
\ \"f1_stderr\": 0.004813528422862131,\n \"acc\": 0.5735917227001633,\n\
\ \"acc_stderr\": 0.011696543872157381\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.5506501677852349,\n \"em_stderr\": 0.0050941277409732805,\n\
\ \"f1\": 0.5974674916107394,\n \"f1_stderr\": 0.004813528422862131\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.31766489764973466,\n \
\ \"acc_stderr\": 0.012824066621488854\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825909\n\
\ }\n}\n```"
repo_url: https://huggingface.co/v2ray/LLaMA-2-Jannie-70B-QLoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_09T18_55_45.725131
path:
- '**/details_harness|drop|3_2023-10-09T18-55-45.725131.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-09T18-55-45.725131.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_09T18_55_45.725131
path:
- '**/details_harness|gsm8k|5_2023-10-09T18-55-45.725131.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-09T18-55-45.725131.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_09T18_55_45.725131
path:
- '**/details_harness|winogrande|5_2023-10-09T18-55-45.725131.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-09T18-55-45.725131.parquet'
- config_name: results
data_files:
- split: 2023_10_09T18_55_45.725131
path:
- results_2023-10-09T18-55-45.725131.parquet
- split: latest
path:
- results_2023-10-09T18-55-45.725131.parquet
---
# Dataset Card for Evaluation run of v2ray/LLaMA-2-Jannie-70B-QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/v2ray/LLaMA-2-Jannie-70B-QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [v2ray/LLaMA-2-Jannie-70B-QLoRA](https://huggingface.co/v2ray/LLaMA-2-Jannie-70B-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-09T18:55:45.725131](https://huggingface.co/datasets/open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA/blob/main/results_2023-10-09T18-55-45.725131.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.5506501677852349,
"em_stderr": 0.0050941277409732805,
"f1": 0.5974674916107394,
"f1_stderr": 0.004813528422862131,
"acc": 0.5735917227001633,
"acc_stderr": 0.011696543872157381
},
"harness|drop|3": {
"em": 0.5506501677852349,
"em_stderr": 0.0050941277409732805,
"f1": 0.5974674916107394,
"f1_stderr": 0.004813528422862131
},
"harness|gsm8k|5": {
"acc": 0.31766489764973466,
"acc_stderr": 0.012824066621488854
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825909
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jony4583/ddetr-cvpdl | 2023-10-09T19:16:31.000Z | [
"region:us"
] | jony4583 | null | null | null | 0 | 0 | Entry not found |
Sharka/savve_dsjs | 2023-10-09T19:33:02.000Z | [
"region:us"
] | Sharka | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16 | 2023-10-09T19:23:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of bhenrym14/mistral-7b-platypus-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bhenrym14/mistral-7b-platypus-fp16](https://huggingface.co/bhenrym14/mistral-7b-platypus-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T19:22:13.143311](https://huggingface.co/datasets/open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16/blob/main/results_2023-10-09T19-22-13.143311.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6402308035116019,\n\
\ \"acc_stderr\": 0.032911890596483966,\n \"acc_norm\": 0.6443447334284652,\n\
\ \"acc_norm_stderr\": 0.03288781491320835,\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386854,\n \"mc2\": 0.4507090146368127,\n\
\ \"mc2_stderr\": 0.01467858507646839\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225405,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491892\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6414060944035053,\n\
\ \"acc_stderr\": 0.0047860751075721845,\n \"acc_norm\": 0.8414658434574785,\n\
\ \"acc_norm_stderr\": 0.0036449467300446107\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859372,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859372\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7612903225806451,\n \"acc_stderr\": 0.024251071262208837,\n \"\
acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.024251071262208837\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119994,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119994\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848047,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848047\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540496,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540496\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.035208939510976534,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.035208939510976534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077823,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077823\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973131,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973131\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n\
\ \"acc_stderr\": 0.016337268694270102,\n \"acc_norm\": 0.39329608938547483,\n\
\ \"acc_norm_stderr\": 0.016337268694270102\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912248,\n\
\ \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912248\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223684,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223684\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47783572359843546,\n\
\ \"acc_stderr\": 0.012757683047716175,\n \"acc_norm\": 0.47783572359843546,\n\
\ \"acc_norm_stderr\": 0.012757683047716175\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386854,\n \"mc2\": 0.4507090146368127,\n\
\ \"mc2_stderr\": 0.01467858507646839\n }\n}\n```"
repo_url: https://huggingface.co/bhenrym14/mistral-7b-platypus-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|arc:challenge|25_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hellaswag|10_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T19-22-13.143311.parquet'
- config_name: results
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- results_2023-10-09T19-22-13.143311.parquet
- split: latest
path:
- results_2023-10-09T19-22-13.143311.parquet
---
# Dataset Card for Evaluation run of bhenrym14/mistral-7b-platypus-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bhenrym14/mistral-7b-platypus-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bhenrym14/mistral-7b-platypus-fp16](https://huggingface.co/bhenrym14/mistral-7b-platypus-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16",
"harness_truthfulqa_mc_0",
	split="latest")
```
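Once loaded, each record in the "results" config holds the nested metrics JSON shown under "Latest results" below. As a minimal sketch (assuming the `"harness|<task>|<n_shots>"` key layout used in that JSON), the nested dict can be flattened into one row per task for easier tabulation:

```python
# Minimal sketch: flatten a nested results dict (as shown under "Latest results")
# into one row per task. Assumes keys of the form "harness|<task>|<n_shots>".

def flatten_results(results: dict) -> list[dict]:
    rows = []
    for key, metrics in results.items():
        if key == "all":
            continue  # skip the aggregate entry
        # keys look like "harness|hendrycksTest-anatomy|5"
        _, task, shots = key.split("|")
        rows.append({"task": task, "shots": int(shots), **metrics})
    return rows

# Small sample in the same shape as the results JSON below
sample = {
    "all": {"acc": 0.6402308035116019},
    "harness|arc:challenge|25": {"acc": 0.5878839590443686,
                                 "acc_norm": 0.6305460750853242},
    "harness|hellaswag|10": {"acc": 0.6414060944035053,
                             "acc_norm": 0.8414658434574785},
}
rows = flatten_results(sample)
```

This is an illustration of the key layout only; the `flatten_results` helper is not part of the `datasets` library.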
## Latest results
These are the [latest results from run 2023-10-09T19:22:13.143311](https://huggingface.co/datasets/open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16/blob/main/results_2023-10-09T19-22-13.143311.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6402308035116019,
"acc_stderr": 0.032911890596483966,
"acc_norm": 0.6443447334284652,
"acc_norm_stderr": 0.03288781491320835,
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386854,
"mc2": 0.4507090146368127,
"mc2_stderr": 0.01467858507646839
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225405,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491892
},
"harness|hellaswag|10": {
"acc": 0.6414060944035053,
"acc_stderr": 0.0047860751075721845,
"acc_norm": 0.8414658434574785,
"acc_norm_stderr": 0.0036449467300446107
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859372,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859372
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119994,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119994
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848047,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848047
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540496,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540496
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.035208939510976534,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.035208939510976534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077823,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077823
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973131,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973131
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39329608938547483,
"acc_stderr": 0.016337268694270102,
"acc_norm": 0.39329608938547483,
"acc_norm_stderr": 0.016337268694270102
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912248,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912248
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.02971928127223684,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.02971928127223684
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47783572359843546,
"acc_stderr": 0.012757683047716175,
"acc_norm": 0.47783572359843546,
"acc_norm_stderr": 0.012757683047716175
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386854,
"mc2": 0.4507090146368127,
"mc2_stderr": 0.01467858507646839
}
}
```
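As a minimal sketch of how such a results dump can be consumed (the hand-copied subset below is illustrative only, not the full file), the per-task accuracies can be aggregated like this:

```python
import json

# A minimal, hand-copied subset of the results shown above (illustrative only).
results_json = """
{
  "harness|hendrycksTest-econometrics|5": {"acc": 0.43859649122807015, "acc_norm": 0.43859649122807015},
  "harness|hendrycksTest-global_facts|5": {"acc": 0.38, "acc_norm": 0.38},
  "harness|truthfulqa:mc|0": {"mc1": 0.30354957160342716, "mc2": 0.4507090146368127}
}
"""
results = json.loads(results_json)

# Average "acc" across every task entry that reports one;
# the truthfulqa entry only reports mc1/mc2, so it is skipped.
accs = [v["acc"] for v in results.values() if "acc" in v]
mean_acc = sum(accs) / len(accs)
```

The same pattern applies to `acc_norm` or any other metric key present in the full results object.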
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nuph/nuph-sft | 2023-10-09T19:23:04.000Z | [
"region:us"
] | nuph | null | null | null | 0 | 0 | Entry not found |
muryshev/saiga-chat | 2023-10-10T16:53:01.000Z | [
"region:us"
] | muryshev | null | null | null | 0 | 0 | Entry not found |
ContextualAI/tiny-trivia_qa | 2023-10-09T19:42:17.000Z | [
"region:us"
] | ContextualAI | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: target
dtype: string
- name: query
dtype: string
- name: gold_generation
sequence: string
splits:
- name: dev
num_bytes: 34332
num_examples: 100
download_size: 24000
dataset_size: 34332
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
---
# Dataset Card for "tiny-trivia_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ContextualAI/tiny-nq_open | 2023-10-09T19:42:33.000Z | [
"region:us"
] | ContextualAI | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: query
dtype: string
- name: gold_generation
sequence: string
splits:
- name: dev
num_bytes: 7565
num_examples: 100
download_size: 7451
dataset_size: 7565
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
---
# Dataset Card for "tiny-nq_open"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sharka/CIVQA_easyocr_encode_train | 2023-10-09T20:57:19.000Z | [
"region:us"
] | Sharka | null | null | null | 0 | 0 | Entry not found |
ricahrd/duduss | 2023-10-09T20:30:24.000Z | [
"region:us"
] | ricahrd | null | null | null | 0 | 0 | Entry not found |
totally-not-an-llm/ZorgonChat | 2023-10-09T20:48:26.000Z | [
"license:mit",
"region:us"
] | totally-not-an-llm | null | null | null | 0 | 0 | ---
license: mit
---
|
ArmelRandy/oa_lima_strat_qcm | 2023-10-09T20:43:48.000Z | [
"region:us"
] | ArmelRandy | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 24587836.604066804
num_examples: 18828
- name: test
num_bytes: 1294165.3959331955
num_examples: 991
download_size: 16177809
dataset_size: 25882002.0
---
# Dataset Card for "oa_lima_strat_qcm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fimu-docproc-research/CIVQA_easyocr_encode_train | 2023-10-09T20:54:46.000Z | [
"region:us"
] | fimu-docproc-research | null | null | null | 0 | 0 | Entry not found |
felipeoes/filtered_raw_text_legislatio_blue_amazon | 2023-10-09T20:51:58.000Z | [
"region:us"
] | felipeoes | null | null | null | 0 | 0 | Entry not found |
ostapeno/cot | 2023-10-09T21:04:34.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 115613738
num_examples: 100000
download_size: 52113324
dataset_size: 115613738
---
# Dataset Card for "cot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ostapeno/dolly | 2023-10-09T21:04:36.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 13007120
num_examples: 15011
download_size: 7493126
dataset_size: 13007120
---
# Dataset Card for "dolly"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ostapeno/oasst1 | 2023-10-09T21:04:39.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 51422776
num_examples: 33919
download_size: 20867411
dataset_size: 51422776
---
# Dataset Card for "oasst1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ostapeno/sharegpt | 2023-10-09T21:04:57.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 815707764
num_examples: 168864
download_size: 347091152
dataset_size: 815707764
---
# Dataset Card for "sharegpt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ostapeno/stanford_alpaca | 2023-10-09T21:05:00.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 23769688
num_examples: 52002
download_size: 12254044
dataset_size: 23769688
---
# Dataset Card for "stanford_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ostapeno/self_instruct | 2023-10-09T21:05:03.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 27516583
num_examples: 82439
download_size: 11204230
dataset_size: 27516583
---
# Dataset Card for "self_instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ostapeno/gpt4_alpaca | 2023-10-09T21:05:06.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 44542771
num_examples: 52002
download_size: 24271598
dataset_size: 44542771
---
# Dataset Card for "gpt4_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ostapeno/code_alpaca | 2023-10-09T21:05:09.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 7830075
num_examples: 20022
download_size: 3538209
dataset_size: 7830075
---
# Dataset Card for "code_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
YaHi/chinese_AAAI_Math | 2023-10-09T21:06:28.000Z | [
"region:us"
] | YaHi | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset_version
dtype: timestamp[s]
- name: queId
dtype: string
- name: difficulty
dtype: string
- name: qtype
dtype: string
- name: problem
dtype: string
- name: knowledge_point_routes
sequence: string
splits:
- name: train
num_bytes: 2911523
num_examples: 7436
download_size: 1485592
dataset_size: 2911523
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chinese_AAAI_Math"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jamescalam/ai-arxiv | 2023-10-10T12:57:37.000Z | [
"region:us"
] | jamescalam | null | null | null | 4 | 0 | Entry not found |
jamescalam/ai-arxiv-chunked | 2023-10-10T12:56:09.000Z | [
"region:us"
] | jamescalam | null | null | null | 6 | 0 | Entry not found |
nightmare-nectarine/segmentation-carla-driving | 2023-10-09T21:15:59.000Z | [
"license:mit",
"region:us"
] | nightmare-nectarine | null | null | null | 0 | 0 | ---
license: mit
---
|
autoevaluate/autoeval-eval-squad_v2-squad_v2-6e4e67-94066145983 | 2023-10-09T21:22:42.000Z | [
"region:us"
] | autoevaluate | null | null | null | 0 | 0 | Entry not found |
ostapeno/tulu_v2_cot_subset | 2023-10-09T21:23:58.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 57705790
num_examples: 50000
download_size: 25971959
dataset_size: 57705790
---
# Dataset Card for "tulu_v2_cot_subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ostapeno/tulu_v2_flan_v2_subset | 2023-10-09T21:24:03.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 111227584
num_examples: 50000
download_size: 64903414
dataset_size: 111227584
---
# Dataset Card for "tulu_v2_flan_v2_subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ostapeno/tulu_v2_oasst1_subset | 2023-10-09T21:24:26.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 12306024
num_examples: 7708
download_size: 7059985
dataset_size: 12306024
---
# Dataset Card for "tulu_v2_oasst1_subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ostapeno/tulu_v2_gpt4_alpaca_subset | 2023-10-09T21:24:33.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 16994301
num_examples: 20000
download_size: 9302507
dataset_size: 16994301
---
# Dataset Card for "tulu_v2_gpt4_alpaca_subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ostapeno/tulu_v2_code_alpaca_subset | 2023-10-09T21:24:35.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 7823498
num_examples: 20022
download_size: 3528838
dataset_size: 7823498
---
# Dataset Card for "tulu_v2_code_alpaca_subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlignmentLab-AI/EverythingIsAllYouNeed | 2023-10-09T21:29:08.000Z | [
"region:us"
] | AlignmentLab-AI | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_xiaol__RWKV-v4-raven-14B-one-state | 2023-10-09T21:39:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xiaol/RWKV-v4-raven-14B-one-state
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xiaol/RWKV-v4-raven-14B-one-state](https://huggingface.co/xiaol/RWKV-v4-raven-14B-one-state)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xiaol__RWKV-v4-raven-14B-one-state\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T21:38:42.028709](https://huggingface.co/datasets/open-llm-leaderboard/details_xiaol__RWKV-v4-raven-14B-one-state/blob/main/results_2023-10-09T21-38-42.028709.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.33924685524661535,\n\
\ \"acc_stderr\": 0.03400094010286168,\n \"acc_norm\": 0.3432206955736541,\n\
\ \"acc_norm_stderr\": 0.03399555734342263,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520681,\n \"mc2\": 0.37298301233557335,\n\
\ \"mc2_stderr\": 0.014007983938605419\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.41467576791808874,\n \"acc_stderr\": 0.014397070564409172,\n\
\ \"acc_norm\": 0.45733788395904434,\n \"acc_norm_stderr\": 0.01455810654392407\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5230033857797252,\n\
\ \"acc_stderr\": 0.004984497871025246,\n \"acc_norm\": 0.714797849034057,\n\
\ \"acc_norm_stderr\": 0.00450587908460685\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.34074074074074073,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.033911609343436004,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.033911609343436004\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.36981132075471695,\n \"acc_stderr\": 0.029711421880107915,\n\
\ \"acc_norm\": 0.36981132075471695,\n \"acc_norm_stderr\": 0.029711421880107915\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.30057803468208094,\n\
\ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.30057803468208094,\n\
\ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.03097669299853443,\n\
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.03097669299853443\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.02226181769240017,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.02226181769240017\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3419354838709677,\n\
\ \"acc_stderr\": 0.026985289576552742,\n \"acc_norm\": 0.3419354838709677,\n\
\ \"acc_norm_stderr\": 0.026985289576552742\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114482,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114482\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.48484848484848486,\n \"acc_stderr\": 0.03902551007374448,\n\
\ \"acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03902551007374448\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.29292929292929293,\n \"acc_stderr\": 0.03242497958178815,\n \"\
acc_norm\": 0.29292929292929293,\n \"acc_norm_stderr\": 0.03242497958178815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.03490205592048575,\n\
\ \"acc_norm\": 0.37305699481865284,\n \"acc_norm_stderr\": 0.03490205592048575\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.27692307692307694,\n \"acc_stderr\": 0.022688042352424994,\n\
\ \"acc_norm\": 0.27692307692307694,\n \"acc_norm_stderr\": 0.022688042352424994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.028801392193631276,\n\
\ \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.028801392193631276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.033367670865679766,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.033367670865679766\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3798165137614679,\n \"acc_stderr\": 0.020808825617866244,\n \"\
acc_norm\": 0.3798165137614679,\n \"acc_norm_stderr\": 0.020808825617866244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2037037037037037,\n \"acc_stderr\": 0.027467401804057986,\n \"\
acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.027467401804057986\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03460228327239171,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03460228327239171\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5232067510548524,\n \"acc_stderr\": 0.032512152011410174,\n \
\ \"acc_norm\": 0.5232067510548524,\n \"acc_norm_stderr\": 0.032512152011410174\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47085201793721976,\n\
\ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.47085201793721976,\n\
\ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3435114503816794,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.3435114503816794,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4132231404958678,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.4132231404958678,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.44871794871794873,\n\
\ \"acc_stderr\": 0.032583346493868806,\n \"acc_norm\": 0.44871794871794873,\n\
\ \"acc_norm_stderr\": 0.032583346493868806\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.41507024265644954,\n\
\ \"acc_stderr\": 0.017620137003655268,\n \"acc_norm\": 0.41507024265644954,\n\
\ \"acc_norm_stderr\": 0.017620137003655268\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.026362437574546534,\n\
\ \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.026362437574546534\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225596,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225596\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.026568921015457152,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.026568921015457152\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3408360128617363,\n\
\ \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.3408360128617363,\n\
\ \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.33641975308641975,\n \"acc_stderr\": 0.026289734945952926,\n\
\ \"acc_norm\": 0.33641975308641975,\n \"acc_norm_stderr\": 0.026289734945952926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880596,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880596\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33116036505867014,\n\
\ \"acc_stderr\": 0.012020128195985757,\n \"acc_norm\": 0.33116036505867014,\n\
\ \"acc_norm_stderr\": 0.012020128195985757\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2867647058823529,\n \"acc_stderr\": 0.02747227447323382,\n\
\ \"acc_norm\": 0.2867647058823529,\n \"acc_norm_stderr\": 0.02747227447323382\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3235294117647059,\n \"acc_stderr\": 0.018926082916083393,\n \
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.018926082916083393\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.37272727272727274,\n\
\ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.37272727272727274,\n\
\ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.027529637440174913,\n\
\ \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.027529637440174913\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4228855721393035,\n\
\ \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.4228855721393035,\n\
\ \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03811079669833531,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03811079669833531\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520681,\n \"mc2\": 0.37298301233557335,\n\
\ \"mc2_stderr\": 0.014007983938605419\n }\n}\n```"
repo_url: https://huggingface.co/xiaol/RWKV-v4-raven-14B-one-state
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|arc:challenge|25_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hellaswag|10_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T21-38-42.028709.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T21-38-42.028709.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T21-38-42.028709.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T21-38-42.028709.parquet'
- config_name: results
data_files:
- split: 2023_10_09T21_38_42.028709
path:
- results_2023-10-09T21-38-42.028709.parquet
- split: latest
path:
- results_2023-10-09T21-38-42.028709.parquet
---
# Dataset Card for Evaluation run of xiaol/RWKV-v4-raven-14B-one-state
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xiaol/RWKV-v4-raven-14B-one-state
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xiaol/RWKV-v4-raven-14B-one-state](https://huggingface.co/xiaol/RWKV-v4-raven-14B-one-state) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xiaol__RWKV-v4-raven-14B-one-state",
"harness_truthfulqa_mc_0",
split="train")
```
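Each configuration name above follows the pattern `harness_<task>_<num_fewshot>`, with `-` and `:` in the harness task key replaced by `_`. As a sketch, the config name for any task in the results below can be derived mechanically (the helper name here is illustrative, not part of the `datasets` API):

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    # Map a harness task key like "hendrycksTest-world_religions" to the
    # config-name style used in this repo, e.g.
    # "harness_hendrycksTest_world_religions_5".
    return f"harness_{task.replace('-', '_').replace(':', '_')}_{num_fewshot}"

print(harness_config_name("hendrycksTest-world_religions", 5))
# harness_hendrycksTest_world_religions_5
print(harness_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

The resulting string can be passed as the second argument to `load_dataset` in the snippet above.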
## Latest results
These are the [latest results from run 2023-10-09T21:38:42.028709](https://huggingface.co/datasets/open-llm-leaderboard/details_xiaol__RWKV-v4-raven-14B-one-state/blob/main/results_2023-10-09T21-38-42.028709.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.33924685524661535,
"acc_stderr": 0.03400094010286168,
"acc_norm": 0.3432206955736541,
"acc_norm_stderr": 0.03399555734342263,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520681,
"mc2": 0.37298301233557335,
"mc2_stderr": 0.014007983938605419
},
"harness|arc:challenge|25": {
"acc": 0.41467576791808874,
"acc_stderr": 0.014397070564409172,
"acc_norm": 0.45733788395904434,
"acc_norm_stderr": 0.01455810654392407
},
"harness|hellaswag|10": {
"acc": 0.5230033857797252,
"acc_stderr": 0.004984497871025246,
"acc_norm": 0.714797849034057,
"acc_norm_stderr": 0.00450587908460685
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.033911609343436004,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.033911609343436004
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.36981132075471695,
"acc_stderr": 0.029711421880107915,
"acc_norm": 0.36981132075471695,
"acc_norm_stderr": 0.029711421880107915
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.03097669299853443,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.03097669299853443
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.02226181769240017,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.02226181769240017
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3419354838709677,
"acc_stderr": 0.026985289576552742,
"acc_norm": 0.3419354838709677,
"acc_norm_stderr": 0.026985289576552742
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114482,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114482
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.03902551007374448,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.03902551007374448
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29292929292929293,
"acc_stderr": 0.03242497958178815,
"acc_norm": 0.29292929292929293,
"acc_norm_stderr": 0.03242497958178815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37305699481865284,
"acc_stderr": 0.03490205592048575,
"acc_norm": 0.37305699481865284,
"acc_norm_stderr": 0.03490205592048575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.27692307692307694,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.27692307692307694,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2689075630252101,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.2689075630252101,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.033367670865679766,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.033367670865679766
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3798165137614679,
"acc_stderr": 0.020808825617866244,
"acc_norm": 0.3798165137614679,
"acc_norm_stderr": 0.020808825617866244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.027467401804057986,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.027467401804057986
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03460228327239171,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03460228327239171
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5232067510548524,
"acc_stderr": 0.032512152011410174,
"acc_norm": 0.5232067510548524,
"acc_norm_stderr": 0.032512152011410174
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.47085201793721976,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.47085201793721976,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3435114503816794,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.3435114503816794,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4132231404958678,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.4132231404958678,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.032583346493868806,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.032583346493868806
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.41507024265644954,
"acc_stderr": 0.017620137003655268,
"acc_norm": 0.41507024265644954,
"acc_norm_stderr": 0.017620137003655268
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.026362437574546534,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.026362437574546534
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225596,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225596
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.026568921015457152,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.026568921015457152
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3408360128617363,
"acc_stderr": 0.026920841260776162,
"acc_norm": 0.3408360128617363,
"acc_norm_stderr": 0.026920841260776162
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.33641975308641975,
"acc_stderr": 0.026289734945952926,
"acc_norm": 0.33641975308641975,
"acc_norm_stderr": 0.026289734945952926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880596,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880596
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.33116036505867014,
"acc_stderr": 0.012020128195985757,
"acc_norm": 0.33116036505867014,
"acc_norm_stderr": 0.012020128195985757
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2867647058823529,
"acc_stderr": 0.02747227447323382,
"acc_norm": 0.2867647058823529,
"acc_norm_stderr": 0.02747227447323382
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.018926082916083393,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.018926082916083393
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.37272727272727274,
"acc_stderr": 0.04631381319425464,
"acc_norm": 0.37272727272727274,
"acc_norm_stderr": 0.04631381319425464
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.027529637440174913,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.027529637440174913
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4228855721393035,
"acc_stderr": 0.034932317774212816,
"acc_norm": 0.4228855721393035,
"acc_norm_stderr": 0.034932317774212816
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03811079669833531,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03811079669833531
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520681,
"mc2": 0.37298301233557335,
"mc2_stderr": 0.014007983938605419
}
}
```
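The per-task scores above can be aggregated by hand, for instance as a simple unweighted mean over the `acc` values. A minimal sketch over a few of the entries shown above (the leaderboard's own `"all"` figure may use a different aggregation, so treat this as illustrative only):

```python
# Sketch: unweighted mean accuracy over a handful of the per-task
# entries from the results JSON above. Illustrative only -- the
# leaderboard's reported "all" value may be aggregated differently.

results = {
    "harness|arc:challenge|25": {"acc": 0.41467576791808874},
    "harness|hellaswag|10": {"acc": 0.5230033857797252},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23},
}

# Collect every task-level "acc" value and average them.
accs = [scores["acc"] for scores in results.values()]
mean_acc = sum(accs) / len(accs)
print(f"unweighted mean acc over {len(accs)} tasks: {mean_acc:.4f}")
# unweighted mean acc over 3 tasks: 0.3892
```

The same loop extends naturally to `acc_norm` or to the full set of tasks once the `results` config has been loaded.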
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
blockplacer4/hobby-dataset-v4 | 2023-10-11T00:52:35.000Z | [
"region:us"
] | blockplacer4 | null | null | null | 0 | 0 | Entry not found |
hmao/rule_learning_data_v0 | 2023-10-09T22:28:43.000Z | [
"region:us"
] | hmao | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: rule
dtype: string
- name: task_name
dtype: string
- name: configuration
dtype: string
- name: description
dtype: string
- name: filepath
dtype: string
- name: old_instruction
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 6226117
num_examples: 2009
download_size: 2213175
dataset_size: 6226117
---
# Dataset Card for "rule_learning_data_v0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hacktoberfest-corpus-es/newyorker_caption_contest_spanish | 2023-10-10T04:22:41.000Z | [
"license:cc-by-2.0",
"region:us"
] | hacktoberfest-corpus-es | null | null | null | 0 | 0 | ---
license: cc-by-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: contest_number
dtype: int32
- name: image_location
dtype: string
- name: image_description
dtype: string
- name: image_uncanny_description
dtype: string
- name: entities
sequence: string
- name: questions
sequence: string
- name: caption_choices
dtype: string
- name: from_description
dtype: string
- name: label
dtype: string
- name: n_tokens_label
dtype: int32
- name: instance_id
dtype: string
splits:
- name: train
num_bytes: 134115134.64
num_examples: 2340
- name: validation
num_bytes: 8055329.0
num_examples: 130
- name: test
num_bytes: 6878764.0
num_examples: 131
download_size: 139896532
dataset_size: 149049227.64
---
|
aurob96/your-dataset-name | 2023-10-10T15:17:57.000Z | [
"region:us"
] | aurob96 | null | null | null | 0 | 0 | Entry not found |
hmao/rule-sql-v1 | 2023-10-09T22:43:35.000Z | [
"region:us"
] | hmao | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: response
dtype: string
- name: source
dtype: string
- name: text
dtype: string
- name: rule
dtype: string
- name: software
dtype: string
splits:
- name: train
num_bytes: 863452252
num_examples: 262208
download_size: 225135160
dataset_size: 863452252
---
# Dataset Card for "rule-sql-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |