| datasetId | card |
|---|---|
kaczmarj/wsinfer-model-zoo-json | ---
license: cc-by-4.0
pretty_name: WSInfer Model Zoo Registry
---
This is the registry of models in the WSInfer Model Zoo.
See https://wsinfer.readthedocs.io/en/latest/ and https://github.com/SBU-BMI/wsinfer-zoo for more information.
|
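Each card in this table begins with a YAML front matter block delimited by `---` lines, followed by the free-form card body. As a minimal sketch (real tooling such as `huggingface_hub` parses cards more robustly, but the delimiter convention is the same), the two parts can be separated like this:

```python
# Split a Hugging Face dataset card into its YAML front matter and body.
# Minimal sketch of the `--- ... ---` delimiter convention used by every
# card in this table; the example string mirrors the WSInfer card above.

def split_card(card: str) -> tuple[str, str]:
    """Return (front_matter, body) for a card that starts with '---'."""
    lines = card.splitlines()
    if not lines or lines[0].strip() != "---":
        return "", card  # no front matter block
    for i, line in enumerate(lines[1:], start=1):
        if line.strip() == "---":
            # Join the lines between the two delimiters, and everything after.
            return "\n".join(lines[1:i]), "\n".join(lines[i + 1:])
    return "", card  # unterminated front matter: treat whole card as body


card = """---
license: cc-by-4.0
pretty_name: WSInfer Model Zoo Registry
---
This is the registry of models in the WSInfer Model Zoo."""

meta, body = split_card(card)
```

The front matter carries the machine-readable metadata (`license`, `dataset_info`, `configs`); the body is the human-readable card.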
Ujito/Mukozo | ---
license: bigscience-openrail-m
---
|
alisson40889/ball | ---
license: openrail
---
|
p1atdev/zozotown | ---
license: cc0-1.0
---
|
rai-sandeep/dataset_format | ---
dataset_info:
features:
- name: task
dtype: string
- name: format
dtype: string
splits:
- name: train
num_bytes: 352
num_examples: 2
download_size: 2327
dataset_size: 352
---
# Dataset Card for "dataset_format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/final_train_v4_test_80000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 5763784.5
num_examples: 18000
- name: test
num_bytes: 640420.5
num_examples: 2000
download_size: 2785737
dataset_size: 6404205.0
---
# Dataset Card for "final_train_v4_test_80000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
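The `configs` section of a card maps each split name to a file glob inside the repository, as in the `mHossain/final_train_v4_test_80000` card above. A small sketch of that lookup, assuming the YAML has already been parsed into Python dicts:

```python
# Resolve the file glob registered for a split in a card's `configs` section.
# Sketch over already-parsed metadata; the dict below mirrors the
# mHossain/final_train_v4_test_80000 card's `configs` block.

configs = [
    {
        "config_name": "default",
        "data_files": [
            {"split": "train", "path": "data/train-*"},
            {"split": "test", "path": "data/test-*"},
        ],
    }
]

def files_for_split(configs: list[dict], config_name: str, split: str) -> str:
    """Return the path glob registered for `split` under `config_name`."""
    for cfg in configs:
        if cfg["config_name"] == config_name:
            for entry in cfg["data_files"]:
                if entry["split"] == split:
                    return entry["path"]
    raise KeyError(f"{config_name}/{split} not found")

test_glob = files_for_split(configs, "default", "test")
```

This is the mapping `datasets.load_dataset` consults when it materializes a named split from the repo's Parquet shards.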
ovior/twitter_dataset_1713124765 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2575033
num_examples: 8102
download_size: 1442038
dataset_size: 2575033
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Utkarsh55/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966694
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
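The `dataset_info` block records per-split byte counts whose sum should equal `dataset_size` (while `download_size` is the smaller on-disk compressed size). A sketch of that consistency check, using the figures from the `Utkarsh55/guanaco-llama2-1k` card above:

```python
# Sanity-check that `dataset_size` equals the sum of the per-split byte
# counts in a card's `dataset_info` block. Sketch over parsed metadata;
# the numbers mirror the Utkarsh55/guanaco-llama2-1k card above.

dataset_info = {
    "splits": [{"name": "train", "num_bytes": 1654448, "num_examples": 1000}],
    "download_size": 966694,   # compressed Parquet on disk
    "dataset_size": 1654448,   # uncompressed, in memory
}

def splits_consistent(info: dict) -> bool:
    """True when the split byte counts add up to the declared dataset_size."""
    return sum(s["num_bytes"] for s in info["splits"]) == info["dataset_size"]

ok = splits_consistent(dataset_info)
```

The same check holds for multi-split cards, e.g. 5763784.5 + 640420.5 = 6404205.0 in the `mHossain/final_train_v4_test_80000` card.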
Oztobuzz/Processed_Vi_ShareGPT4V | ---
configs:
- config_name: default
data_files:
- split: train
path: data/*/train-*
- config_name: start_from_000116069
data_files:
- split: train
path: data/start_from_000116069/train-*
- config_name: start_from_000118857
data_files:
- split: train
path: data/start_from_000118857/train-*
- config_name: start_from_000119137
data_files:
- split: train
path: data/start_from_000119137/train-*
- config_name: start_from_000119739
data_files:
- split: train
path: data/start_from_000119739/train-*
- config_name: start_from_000121745
data_files:
- split: train
path: data/start_from_000121745/train-*
- config_name: start_from_000122687
data_files:
- split: train
path: data/start_from_000122687/train-*
- config_name: start_from_000122959
data_files:
- split: train
path: data/start_from_000122959/train-*
- config_name: start_from_000123104
data_files:
- split: train
path: data/start_from_000123104/train-*
- config_name: start_from_000123658
data_files:
- split: train
path: data/start_from_000123658/train-*
- config_name: start_from_000125179
data_files:
- split: train
path: data/start_from_000125179/train-*
- config_name: start_from_000125996
data_files:
- split: train
path: data/start_from_000125996/train-*
- config_name: start_from_000127429
data_files:
- split: train
path: data/start_from_000127429/train-*
- config_name: start_from_000129186
data_files:
- split: train
path: data/start_from_000129186/train-*
- config_name: start_from_000146766
data_files:
- split: train
path: data/start_from_000146766/train-*
- config_name: start_from_000147346
data_files:
- split: train
path: data/start_from_000147346/train-*
- config_name: start_from_000147919
data_files:
- split: train
path: data/start_from_000147919/train-*
- config_name: start_from_000148568
data_files:
- split: train
path: data/start_from_000148568/train-*
- config_name: start_from_000148869
data_files:
- split: train
path: data/start_from_000148869/train-*
- config_name: start_from_000149819
data_files:
- split: train
path: data/start_from_000149819/train-*
- config_name: start_from_000157150
data_files:
- split: train
path: data/start_from_000157150/train-*
- config_name: start_from_000157834
data_files:
- split: train
path: data/start_from_000157834/train-*
- config_name: start_from_000159178
data_files:
- split: train
path: data/start_from_000159178/train-*
- config_name: start_from_000165035
data_files:
- split: train
path: data/start_from_000165035/train-*
- config_name: start_from_000165093
data_files:
- split: train
path: data/start_from_000165093/train-*
- config_name: start_from_000165799
data_files:
- split: train
path: data/start_from_000165799/train-*
- config_name: start_from_000166028
data_files:
- split: train
path: data/start_from_000166028/train-*
- config_name: start_from_000166620
data_files:
- split: train
path: data/start_from_000166620/train-*
- config_name: start_from_000167861
data_files:
- split: train
path: data/start_from_000167861/train-*
- config_name: start_from_000170275
data_files:
- split: train
path: data/start_from_000170275/train-*
- config_name: start_from_000170846
data_files:
- split: train
path: data/start_from_000170846/train-*
- config_name: start_from_000171526
data_files:
- split: train
path: data/start_from_000171526/train-*
- config_name: start_from_000171927
data_files:
- split: train
path: data/start_from_000171927/train-*
- config_name: start_from_000172463
data_files:
- split: train
path: data/start_from_000172463/train-*
- config_name: start_from_000172839
data_files:
- split: train
path: data/start_from_000172839/train-*
- config_name: start_from_000173282
data_files:
- split: train
path: data/start_from_000173282/train-*
- config_name: start_from_000174034
data_files:
- split: train
path: data/start_from_000174034/train-*
- config_name: start_from_000174960
data_files:
- split: train
path: data/start_from_000174960/train-*
- config_name: start_from_000175388
data_files:
- split: train
path: data/start_from_000175388/train-*
- config_name: start_from_000175987
data_files:
- split: train
path: data/start_from_000175987/train-*
- config_name: start_from_000176475
data_files:
- split: train
path: data/start_from_000176475/train-*
- config_name: start_from_000176986
data_files:
- split: train
path: data/start_from_000176986/train-*
- config_name: start_from_000177287
data_files:
- split: train
path: data/start_from_000177287/train-*
- config_name: start_from_000178012
data_files:
- split: train
path: data/start_from_000178012/train-*
- config_name: start_from_000183715
data_files:
- split: train
path: data/start_from_000183715/train-*
- config_name: start_from_000184954
data_files:
- split: train
path: data/start_from_000184954/train-*
- config_name: start_from_000185912
data_files:
- split: train
path: data/start_from_000185912/train-*
- config_name: start_from_000187200
data_files:
- split: train
path: data/start_from_000187200/train-*
- config_name: start_from_000189391
data_files:
- split: train
path: data/start_from_000189391/train-*
- config_name: start_from_000190047
data_files:
- split: train
path: data/start_from_000190047/train-*
- config_name: start_from_000250724
data_files:
- split: train
path: data/start_from_000250724/train-*
- config_name: start_from_000250766
data_files:
- split: train
path: data/start_from_000250766/train-*
- config_name: start_from_03535a6be5d036d6
data_files:
- split: train
path: data/start_from_03535a6be5d036d6/train-*
- config_name: start_from_2b3e5e8b672a73e7
data_files:
- split: train
path: data/start_from_2b3e5e8b672a73e7/train-*
- config_name: start_from_5ff2daab3483a646
data_files:
- split: train
path: data/start_from_5ff2daab3483a646/train-*
- config_name: start_from_9ce1fa0dff30e7d9
data_files:
- split: train
path: data/start_from_9ce1fa0dff30e7d9/train-*
- config_name: start_from_Anthony_Hopkins
data_files:
- split: train
path: data/start_from_Anthony_Hopkins/train-*
- config_name: start_from_Elliot_Page
data_files:
- split: train
path: data/start_from_Elliot_Page/train-*
- config_name: start_from_aubrey-beardsley_no-5
data_files:
- split: train
path: data/start_from_aubrey-beardsley_no-5/train-*
- config_name: start_from_genevieve-asse_ligne-rouge-ii-n-19-1984
data_files:
- split: train
path: data/start_from_genevieve-asse_ligne-rouge-ii-n-19-1984/train-*
- config_name: start_from_john-henry-twachtman_the-portico
data_files:
- split: train
path: data/start_from_john-henry-twachtman_the-portico/train-*
- config_name: start_from_nicholas-roerich_spies-1900
data_files:
- split: train
path: data/start_from_nicholas-roerich_spies-1900/train-*
- config_name: start_from_sa_2579
data_files:
- split: train
path: data/start_from_sa_2579/train-*
- config_name: start_from_sa_2588
data_files:
- split: train
path: data/start_from_sa_2588/train-*
- config_name: start_from_sa_2597
data_files:
- split: train
path: data/start_from_sa_2597/train-*
- config_name: start_from_sa_26059
data_files:
- split: train
path: data/start_from_sa_26059/train-*
- config_name: start_from_sa_2615
data_files:
- split: train
path: data/start_from_sa_2615/train-*
- config_name: start_from_sa_2624
data_files:
- split: train
path: data/start_from_sa_2624/train-*
- config_name: start_from_sa_2633
data_files:
- split: train
path: data/start_from_sa_2633/train-*
- config_name: start_from_sa_2642
data_files:
- split: train
path: data/start_from_sa_2642/train-*
- config_name: start_from_sa_2651
data_files:
- split: train
path: data/start_from_sa_2651/train-*
- config_name: start_from_sa_2660
data_files:
- split: train
path: data/start_from_sa_2660/train-*
- config_name: start_from_sa_26690
data_files:
- split: train
path: data/start_from_sa_26690/train-*
- config_name: start_from_sa_27880
data_files:
- split: train
path: data/start_from_sa_27880/train-*
dataset_info:
- config_name: start_from_000116069
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 243750
num_examples: 100
download_size: 104130
dataset_size: 243750
- config_name: start_from_000118857
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 264039
num_examples: 100
download_size: 116699
dataset_size: 264039
- config_name: start_from_000119137
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 245729
num_examples: 100
download_size: 107775
dataset_size: 245729
- config_name: start_from_000119739
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 266007
num_examples: 100
download_size: 111382
dataset_size: 266007
- config_name: start_from_000121745
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 276381
num_examples: 100
download_size: 110925
dataset_size: 276381
- config_name: start_from_000122687
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 240984
num_examples: 100
download_size: 103289
dataset_size: 240984
- config_name: start_from_000122959
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 235845
num_examples: 100
download_size: 101022
dataset_size: 235845
- config_name: start_from_000123104
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 240498
num_examples: 100
download_size: 105487
dataset_size: 240498
- config_name: start_from_000123658
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 282765
num_examples: 100
download_size: 116701
dataset_size: 282765
- config_name: start_from_000125179
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 267968
num_examples: 100
download_size: 108663
dataset_size: 267968
- config_name: start_from_000125996
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 239475
num_examples: 100
download_size: 106381
dataset_size: 239475
- config_name: start_from_000127429
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 262902
num_examples: 100
download_size: 106254
dataset_size: 262902
- config_name: start_from_000129186
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 255955
num_examples: 100
download_size: 109381
dataset_size: 255955
- config_name: start_from_000146766
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 252468
num_examples: 100
download_size: 121248
dataset_size: 252468
- config_name: start_from_000147346
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 249161
num_examples: 100
download_size: 119582
dataset_size: 249161
- config_name: start_from_000147919
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 263790
num_examples: 100
download_size: 124679
dataset_size: 263790
- config_name: start_from_000148568
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 232832
num_examples: 100
download_size: 104669
dataset_size: 232832
- config_name: start_from_000148869
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 239642
num_examples: 100
download_size: 112592
dataset_size: 239642
- config_name: start_from_000149819
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 252170
num_examples: 100
download_size: 121519
dataset_size: 252170
- config_name: start_from_000157150
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 263540
num_examples: 100
download_size: 123396
dataset_size: 263540
- config_name: start_from_000157834
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 236072
num_examples: 100
download_size: 112569
dataset_size: 236072
- config_name: start_from_000159178
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 234745
num_examples: 100
download_size: 106931
dataset_size: 234745
- config_name: start_from_000165035
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 45010
num_examples: 10
download_size: 21035
dataset_size: 45010
- config_name: start_from_000165093
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 429536
num_examples: 170
download_size: 184772
dataset_size: 429536
- config_name: start_from_000165799
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 198854
num_examples: 80
download_size: 91717
dataset_size: 198854
- config_name: start_from_000166028
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 232523
num_examples: 100
download_size: 108358
dataset_size: 232523
- config_name: start_from_000166620
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 288952
num_examples: 120
download_size: 137851
dataset_size: 288952
- config_name: start_from_000167861
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 255546
num_examples: 100
download_size: 121945
dataset_size: 255546
- config_name: start_from_000170275
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 221779
num_examples: 90
download_size: 108722
dataset_size: 221779
- config_name: start_from_000170846
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 242958
num_examples: 100
download_size: 116596
dataset_size: 242958
- config_name: start_from_000171526
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 240000
num_examples: 90
download_size: 113290
dataset_size: 240000
- config_name: start_from_000171927
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 243902
num_examples: 100
download_size: 113859
dataset_size: 243902
- config_name: start_from_000172463
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 246350
num_examples: 100
download_size: 111720
dataset_size: 246350
- config_name: start_from_000172839
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 279410
num_examples: 100
download_size: 123160
dataset_size: 279410
- config_name: start_from_000173282
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 242536
num_examples: 100
download_size: 117444
dataset_size: 242536
- config_name: start_from_000174034
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 257173
num_examples: 100
download_size: 128240
dataset_size: 257173
- config_name: start_from_000174960
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 243259
num_examples: 100
download_size: 113355
dataset_size: 243259
- config_name: start_from_000175388
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 259422
num_examples: 100
download_size: 122097
dataset_size: 259422
- config_name: start_from_000175987
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 260078
num_examples: 100
download_size: 122560
dataset_size: 260078
- config_name: start_from_000176475
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 231039
num_examples: 100
download_size: 110576
dataset_size: 231039
- config_name: start_from_000176986
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 256685
num_examples: 100
download_size: 111944
dataset_size: 256685
- config_name: start_from_000177287
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 232815
num_examples: 90
download_size: 110404
dataset_size: 232815
- config_name: start_from_000178012
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 240582
num_examples: 100
download_size: 114430
dataset_size: 240582
- config_name: start_from_000183715
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 176781
num_examples: 70
download_size: 87151
dataset_size: 176781
- config_name: start_from_000184954
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 282268
num_examples: 100
download_size: 128525
dataset_size: 282268
- config_name: start_from_000185912
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 246783
num_examples: 100
download_size: 115577
dataset_size: 246783
- config_name: start_from_000187200
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 238008
num_examples: 100
download_size: 111506
dataset_size: 238008
- config_name: start_from_000189391
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 273394
num_examples: 100
download_size: 119663
dataset_size: 273394
- config_name: start_from_000190047
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 235959
num_examples: 100
download_size: 110859
dataset_size: 235959
- config_name: start_from_000250724
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 28019
num_examples: 10
download_size: 19541
dataset_size: 28019
- config_name: start_from_000250766
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 266178
num_examples: 100
download_size: 125285
dataset_size: 266178
- config_name: start_from_03535a6be5d036d6
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 182476
num_examples: 100
download_size: 96673
dataset_size: 182476
- config_name: start_from_2b3e5e8b672a73e7
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 175073
num_examples: 100
download_size: 92430
dataset_size: 175073
- config_name: start_from_5ff2daab3483a646
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 180098
num_examples: 100
download_size: 97530
dataset_size: 180098
- config_name: start_from_9ce1fa0dff30e7d9
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 307568
num_examples: 190
download_size: 157394
dataset_size: 307568
- config_name: start_from_Anthony_Hopkins
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 159992
num_examples: 100
download_size: 75083
dataset_size: 159992
- config_name: start_from_Elliot_Page
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 159272
num_examples: 100
download_size: 77060
dataset_size: 159272
- config_name: start_from_aubrey-beardsley_no-5
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 182276
num_examples: 80
download_size: 89561
dataset_size: 182276
- config_name: start_from_genevieve-asse_ligne-rouge-ii-n-19-1984
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 185464
num_examples: 80
download_size: 92888
dataset_size: 185464
- config_name: start_from_john-henry-twachtman_the-portico
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 220199
num_examples: 100
download_size: 107660
dataset_size: 220199
- config_name: start_from_nicholas-roerich_spies-1900
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 249645
num_examples: 120
download_size: 118765
dataset_size: 249645
- config_name: start_from_sa_2579
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 281870
num_examples: 100
download_size: 133946
dataset_size: 281870
- config_name: start_from_sa_2588
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 279637
num_examples: 100
download_size: 128572
dataset_size: 279637
- config_name: start_from_sa_2597
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 250879
num_examples: 100
download_size: 123618
dataset_size: 250879
- config_name: start_from_sa_26059
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 258299
num_examples: 100
download_size: 128958
dataset_size: 258299
- config_name: start_from_sa_2615
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 257741
num_examples: 100
download_size: 127220
dataset_size: 257741
- config_name: start_from_sa_2624
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 256906
num_examples: 100
download_size: 128616
dataset_size: 256906
- config_name: start_from_sa_2633
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 251630
num_examples: 100
download_size: 125363
dataset_size: 251630
- config_name: start_from_sa_2642
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 258196
num_examples: 100
download_size: 128446
dataset_size: 258196
- config_name: start_from_sa_2651
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 265535
num_examples: 100
download_size: 129999
dataset_size: 265535
- config_name: start_from_sa_2660
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 261595
num_examples: 100
download_size: 129352
dataset_size: 261595
- config_name: start_from_sa_26690
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 253819
num_examples: 100
download_size: 124359
dataset_size: 253819
- config_name: start_from_sa_27880
features:
- name: id
dtype: string
- name: image
dtype: string
- name: en_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: vi_conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 233949
num_examples: 100
download_size: 106162
dataset_size: 233949
---
|
open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v1.0 | ---
pretty_name: Evaluation run of TeeZee/DarkSapling-7B-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/DarkSapling-7B-v1.0](https://huggingface.co/TeeZee/DarkSapling-7B-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T01:29:46.397110](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v1.0/blob/main/results_2024-02-10T01-29-46.397110.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6228272261948034,\n\
\ \"acc_stderr\": 0.032723127441021765,\n \"acc_norm\": 0.6278792920359817,\n\
\ \"acc_norm_stderr\": 0.03338301615189635,\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.016040352966713627,\n \"mc2\": 0.45088578827366993,\n\
\ \"mc2_stderr\": 0.01466973973064534\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221007,\n\
\ \"acc_norm\": 0.6160409556313993,\n \"acc_norm_stderr\": 0.01421244498065189\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6296554471220872,\n\
\ \"acc_stderr\": 0.004819100456867812,\n \"acc_norm\": 0.8259310894244174,\n\
\ \"acc_norm_stderr\": 0.0037839381501516165\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901409,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901409\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.04697085136647861,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.04697085136647861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267052,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267052\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397443,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.02446861524147893,\n \
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.02446861524147893\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763079,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763079\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.01414397027665757,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.01414397027665757\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165545,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37206703910614525,\n\
\ \"acc_stderr\": 0.016165847583563295,\n \"acc_norm\": 0.37206703910614525,\n\
\ \"acc_norm_stderr\": 0.016165847583563295\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900922,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900922\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42894393741851367,\n\
\ \"acc_stderr\": 0.012640625443067356,\n \"acc_norm\": 0.42894393741851367,\n\
\ \"acc_norm_stderr\": 0.012640625443067356\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6405228758169934,\n \"acc_stderr\": 0.019412539242032165,\n \
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.019412539242032165\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801301,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.016040352966713627,\n \"mc2\": 0.45088578827366993,\n\
\ \"mc2_stderr\": 0.01466973973064534\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663592\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40181956027293403,\n \
\ \"acc_stderr\": 0.013504357787494032\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/DarkSapling-7B-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|arc:challenge|25_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|gsm8k|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hellaswag|10_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-29-46.397110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T01-29-46.397110.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- '**/details_harness|winogrande|5_2024-02-10T01-29-46.397110.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T01-29-46.397110.parquet'
- config_name: results
data_files:
- split: 2024_02_10T01_29_46.397110
path:
- results_2024-02-10T01-29-46.397110.parquet
- split: latest
path:
- results_2024-02-10T01-29-46.397110.parquet
---
# Dataset Card for Evaluation run of TeeZee/DarkSapling-7B-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/DarkSapling-7B-v1.0](https://huggingface.co/TeeZee/DarkSapling-7B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v1.0",
"harness_winogrande_5",
split="train")
```
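The per-task config names follow a predictable pattern — for the Hendrycks MMLU subtasks it is `harness_hendrycksTest_<subject>_<n_shot>`, as the configuration list above shows — so they can be built programmatically rather than copied by hand. A small sketch (the subject list here is abbreviated for illustration):

```python
def mmlu_config(subject: str, n_shot: int = 5) -> str:
    """Build the config name for a Hendrycks MMLU subtask in this dataset."""
    return f"harness_hendrycksTest_{subject}_{n_shot}"

# Abbreviated subject list; the full set of 57 subjects appears in the
# configuration listing above.
subjects = ["abstract_algebra", "anatomy", "astronomy"]
configs = [mmlu_config(s) for s in subjects]
print(configs[1])  # harness_hendrycksTest_anatomy_5
```

Each of these names can then be passed as the second argument to `load_dataset` exactly as in the `harness_winogrande_5` example above.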
## Latest results
These are the [latest results from run 2024-02-10T01:29:46.397110](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v1.0/blob/main/results_2024-02-10T01-29-46.397110.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6228272261948034,
"acc_stderr": 0.032723127441021765,
"acc_norm": 0.6278792920359817,
"acc_norm_stderr": 0.03338301615189635,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.016040352966713627,
"mc2": 0.45088578827366993,
"mc2_stderr": 0.01466973973064534
},
"harness|arc:challenge|25": {
"acc": 0.5861774744027304,
"acc_stderr": 0.014392730009221007,
"acc_norm": 0.6160409556313993,
"acc_norm_stderr": 0.01421244498065189
},
"harness|hellaswag|10": {
"acc": 0.6296554471220872,
"acc_stderr": 0.004819100456867812,
"acc_norm": 0.8259310894244174,
"acc_norm_stderr": 0.0037839381501516165
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901409,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901409
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04697085136647861,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04697085136647861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267052,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267052
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397443,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.02446861524147893,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.02446861524147893
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.01726674208763079,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.01726674208763079
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.01414397027665757,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.01414397027665757
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165545,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37206703910614525,
"acc_stderr": 0.016165847583563295,
"acc_norm": 0.37206703910614525,
"acc_norm_stderr": 0.016165847583563295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900922,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42894393741851367,
"acc_stderr": 0.012640625443067356,
"acc_norm": 0.42894393741851367,
"acc_norm_stderr": 0.012640625443067356
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.019412539242032165,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.019412539242032165
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.016040352966713627,
"mc2": 0.45088578827366993,
"mc2_stderr": 0.01466973973064534
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663592
},
"harness|gsm8k|5": {
"acc": 0.40181956027293403,
"acc_stderr": 0.013504357787494032
}
}
```
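Because every per-task entry in the JSON above shares the same shape (a dict with `acc`, `acc_stderr`, and usually `acc_norm` keys), it takes only a few lines to rank subjects by accuracy once the results file is loaded. A sketch using a hand-copied subset of the values shown above (in practice you would `json.load` the linked results file instead):

```python
# Subset of the "latest results" JSON above, copied by hand for illustration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27},
    "harness|hendrycksTest-virology|5": {"acc": 0.5481927710843374},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8421052631578947},
}

# Sort tasks from highest to lowest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)

for name, metrics in ranked:
    # Task names are pipe-delimited: harness|<task>|<n_shot>.
    subject = name.split("|")[1].removeprefix("hendrycksTest-")
    print(f"{subject}: {metrics['acc']:.3f}")
```

This prints `world_religions` first and `abstract_algebra` last for the subset above, matching the full results.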
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
irds/nyt_wksup | ---
pretty_name: '`nyt/wksup`'
viewer: false
source_datasets: ['irds/nyt']
task_categories:
- text-retrieval
---
# Dataset Card for `nyt/wksup`
The `nyt/wksup` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/nyt#nyt/wksup).
# Data
This dataset provides:
- `queries` (i.e., topics); count=1,864,661
- `qrels` (relevance assessments); count=1,864,661
- For `docs`, use [`irds/nyt`](https://huggingface.co/datasets/irds/nyt)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/nyt_wksup', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/nyt_wksup', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
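Each qrels record yielded above carries a `query_id`, `doc_id`, and `relevance`; before evaluation these are typically grouped into a per-query lookup. A small sketch under hypothetical sample records (the `build_qrels_index` helper and the record values are ours, not part of ir-datasets):

```python
from collections import defaultdict

# Hypothetical qrels records, shaped like the ones yielded above.
qrels = [
    {"query_id": "q1", "doc_id": "d1", "relevance": 1},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0},
    {"query_id": "q2", "doc_id": "d3", "relevance": 1},
]

def build_qrels_index(records):
    """Group relevance judgments by query_id for O(1) lookup during evaluation."""
    index = defaultdict(dict)
    for rec in records:
        index[rec["query_id"]][rec["doc_id"]] = rec["relevance"]
    return dict(index)

index = build_qrels_index(qrels)
```

The same grouping applies unchanged to the real `qrels` split once loaded.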
## Citation Information
```
@inproceedings{MacAvaney2019Wksup,
author = {MacAvaney, Sean and Yates, Andrew and Hui, Kai and Frieder, Ophir},
title = {Content-Based Weak Supervision for Ad-Hoc Re-Ranking},
booktitle = {SIGIR},
year = {2019}
}
@article{Sandhaus2008Nyt,
title={The new york times annotated corpus},
author={Sandhaus, Evan},
journal={Linguistic Data Consortium, Philadelphia},
volume={6},
number={12},
pages={e26752},
year={2008}
}
```
|
open-llm-leaderboard/details_tiiuae__falcon-180B | ---
pretty_name: Evaluation run of tiiuae/falcon-180B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [tiiuae/falcon-180B](https://huggingface.co/tiiuae/falcon-180B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 66 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 32 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tiiuae__falcon-180B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T10:17:51.759984](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-180B/blob/main/results_2023-10-24T10-17-51.759984.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0028313758389261743,\n\
\ \"em_stderr\": 0.0005441551135493806,\n \"f1\": 0.06573301174496615,\n\
\ \"f1_stderr\": 0.0013666874377791776,\n \"acc\": 0.6642104078991223,\n\
\ \"acc_stderr\": 0.011605139145295384\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0028313758389261743,\n \"em_stderr\": 0.0005441551135493806,\n\
\ \"f1\": 0.06573301174496615,\n \"f1_stderr\": 0.0013666874377791776\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45943896891584535,\n \
\ \"acc_stderr\": 0.01372709301042978\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8689818468823993,\n \"acc_stderr\": 0.009483185280160986\n\
\ }\n}\n```"
repo_url: https://huggingface.co/tiiuae/falcon-180B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|arc:challenge|25_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|arc:challenge|25_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|arc:challenge|25_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|arc:challenge|25_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|arc:challenge|25_2023-09-01T15:12:02.263774.parquet'
- split: 2023_09_25T09_30_46.601936
path:
- '**/details_harness|arc:challenge|25_2023-09-25T09-30-46.601936.parquet'
- split: 2023_09_25T09_42_43.006060
path:
- '**/details_harness|arc:challenge|25_2023-09-25T09-42-43.006060.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-25T09-42-43.006060.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T17_29_05.444286
path:
- '**/details_harness|drop|3_2023-10-23T17-29-05.444286.parquet'
- split: 2023_10_24T10_17_51.759984
path:
- '**/details_harness|drop|3_2023-10-24T10-17-51.759984.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T10-17-51.759984.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T17_29_05.444286
path:
- '**/details_harness|gsm8k|5_2023-10-23T17-29-05.444286.parquet'
- split: 2023_10_24T10_17_51.759984
path:
- '**/details_harness|gsm8k|5_2023-10-24T10-17-51.759984.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T10-17-51.759984.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hellaswag|10_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hellaswag|10_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hellaswag|10_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hellaswag|10_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hellaswag|10_2023-09-01T15:12:02.263774.parquet'
- split: 2023_09_25T11_16_10.146827
path:
- '**/details_harness|hellaswag|10_2023-09-25T11-16-10.146827.parquet'
- split: 2023_09_25T11_28_53.879118
path:
- '**/details_harness|hellaswag|10_2023-09-25T11-28-53.879118.parquet'
- split: 2023_09_25T13_20_00.898508
path:
- '**/details_harness|hellaswag|10_2023-09-25T13-20-00.898508.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-25T13-20-00.898508.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T14:31:39.488381.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T19:27:57.090829.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T01:32:36.577851.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T12:44:38.148712.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T15:12:02.263774.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:12:02.263774.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:12:02.263774.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T14_31_39.488381
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T14:31:39.488381.parquet'
- split: 2023_08_30T19_27_57.090829
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T19:27:57.090829.parquet'
- split: 2023_08_31T01_32_36.577851
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T01:32:36.577851.parquet'
- split: 2023_08_31T12_44_38.148712
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T12:44:38.148712.parquet'
- split: 2023_09_01T15_12_02.263774
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T15:12:02.263774.parquet'
- split: 2023_09_25T09_49_01.514206
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-25T09-49-01.514206.parquet'
- split: 2023_09_25T09_57_43.547983
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-25T09-57-43.547983.parquet'
- split: 2023_09_25T10_06_12.822356
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-25T10-06-12.822356.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-25T10-06-12.822356.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T17_29_05.444286
path:
- '**/details_harness|winogrande|5_2023-10-23T17-29-05.444286.parquet'
- split: 2023_10_24T10_17_51.759984
path:
- '**/details_harness|winogrande|5_2023-10-24T10-17-51.759984.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T10-17-51.759984.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_09_21T14_54_28.631498
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-21T14-54-28.631498.parquet'
- split: 2023_09_21T15_14_19.361952
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-21T15-14-19.361952.parquet'
- split: 2023_09_22T15_08_20.868776
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T15-08-20.868776.parquet'
- split: 2023_09_22T15_09_58.434868
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T15-09-58.434868.parquet'
- split: 2023_09_22T15_40_03.532661
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T15-40-03.532661.parquet'
- split: 2023_09_22T19_13_36.680152
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-13-36.680152.parquet'
- split: 2023_09_22T19_25_51.687929
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-25-51.687929.parquet'
- split: 2023_09_22T19_38_30.055713
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-38-30.055713.parquet'
- split: 2023_09_22T19_56_14.188877
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-56-14.188877.parquet'
- split: 2023_09_22T20_44_00.745184
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T20-44-00.745184.parquet'
- split: 2023_09_22T21_16_36.510313
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-16-36.510313.parquet'
- split: 2023_09_22T21_30_38.663736
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-30-38.663736.parquet'
- split: 2023_09_22T21_39_07.387549
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-39-07.387549.parquet'
- split: 2023_09_22T21_46_48.392874
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-46-48.392874.parquet'
- split: 2023_09_22T22_06_13.624503
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T22-06-13.624503.parquet'
- split: 2023_09_22T22_21_06.865348
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T22-21-06.865348.parquet'
- split: 2023_09_23T09_44_24.946036
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-23T09-44-24.946036.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-23T09-44-24.946036.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T14_54_28.631498
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-21T14-54-28.631498.parquet'
- split: 2023_09_21T15_14_19.361952
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-21T15-14-19.361952.parquet'
- split: 2023_09_22T15_08_20.868776
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T15-08-20.868776.parquet'
- split: 2023_09_22T15_09_58.434868
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T15-09-58.434868.parquet'
- split: 2023_09_22T15_40_03.532661
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T15-40-03.532661.parquet'
- split: 2023_09_22T19_13_36.680152
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-13-36.680152.parquet'
- split: 2023_09_22T19_25_51.687929
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-25-51.687929.parquet'
- split: 2023_09_22T19_38_30.055713
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-38-30.055713.parquet'
- split: 2023_09_22T19_56_14.188877
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-56-14.188877.parquet'
- split: 2023_09_22T20_44_00.745184
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T20-44-00.745184.parquet'
- split: 2023_09_22T21_16_36.510313
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-16-36.510313.parquet'
- split: 2023_09_22T21_30_38.663736
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-30-38.663736.parquet'
- split: 2023_09_22T21_39_07.387549
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-39-07.387549.parquet'
- split: 2023_09_22T21_46_48.392874
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-46-48.392874.parquet'
- split: 2023_09_22T22_06_13.624503
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T22-06-13.624503.parquet'
- split: 2023_09_22T22_21_06.865348
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T22-21-06.865348.parquet'
- split: 2023_09_23T09_44_24.946036
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-23T09-44-24.946036.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-09-23T09-44-24.946036.parquet'
- config_name: results
data_files:
- split: 2023_09_21T14_54_28.631498
path:
- results_2023-09-21T14-54-28.631498.parquet
- split: 2023_09_21T15_14_19.361952
path:
- results_2023-09-21T15-14-19.361952.parquet
- split: 2023_09_22T15_08_20.868776
path:
- results_2023-09-22T15-08-20.868776.parquet
- split: 2023_09_22T15_09_58.434868
path:
- results_2023-09-22T15-09-58.434868.parquet
- split: 2023_09_22T15_40_03.532661
path:
- results_2023-09-22T15-40-03.532661.parquet
- split: 2023_09_22T19_13_36.680152
path:
- results_2023-09-22T19-13-36.680152.parquet
- split: 2023_09_22T19_25_51.687929
path:
- results_2023-09-22T19-25-51.687929.parquet
- split: 2023_09_22T19_38_30.055713
path:
- results_2023-09-22T19-38-30.055713.parquet
- split: 2023_09_22T19_56_14.188877
path:
- results_2023-09-22T19-56-14.188877.parquet
- split: 2023_09_22T20_44_00.745184
path:
- results_2023-09-22T20-44-00.745184.parquet
- split: 2023_09_22T21_16_36.510313
path:
- results_2023-09-22T21-16-36.510313.parquet
- split: 2023_09_22T21_30_38.663736
path:
- results_2023-09-22T21-30-38.663736.parquet
- split: 2023_09_22T21_39_07.387549
path:
- results_2023-09-22T21-39-07.387549.parquet
- split: 2023_09_22T21_46_48.392874
path:
- results_2023-09-22T21-46-48.392874.parquet
- split: 2023_09_22T22_06_13.624503
path:
- results_2023-09-22T22-06-13.624503.parquet
- split: 2023_09_22T22_21_06.865348
path:
- results_2023-09-22T22-21-06.865348.parquet
- split: 2023_09_23T09_44_24.946036
path:
- results_2023-09-23T09-44-24.946036.parquet
- split: 2023_09_25T09_30_46.601936
path:
- results_2023-09-25T09-30-46.601936.parquet
- split: 2023_09_25T09_42_43.006060
path:
- results_2023-09-25T09-42-43.006060.parquet
- split: 2023_09_25T09_49_01.514206
path:
- results_2023-09-25T09-49-01.514206.parquet
- split: 2023_09_25T09_57_43.547983
path:
- results_2023-09-25T09-57-43.547983.parquet
- split: 2023_09_25T10_06_12.822356
path:
- results_2023-09-25T10-06-12.822356.parquet
- split: 2023_09_25T11_16_10.146827
path:
- results_2023-09-25T11-16-10.146827.parquet
- split: 2023_09_25T11_28_53.879118
path:
- results_2023-09-25T11-28-53.879118.parquet
- split: 2023_09_25T13_20_00.898508
path:
- results_2023-09-25T13-20-00.898508.parquet
- split: 2023_10_23T17_29_05.444286
path:
- results_2023-10-23T17-29-05.444286.parquet
- split: 2023_10_24T10_17_51.759984
path:
- results_2023-10-24T10-17-51.759984.parquet
- split: latest
path:
- results_2023-10-24T10-17-51.759984.parquet
---
# Dataset Card for Evaluation run of tiiuae/falcon-180B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/tiiuae/falcon-180B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [tiiuae/falcon-180B](https://huggingface.co/tiiuae/falcon-180B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 66 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 32 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tiiuae__falcon-180B",
"harness_winogrande_5",
	split="latest")
```
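Each timestamped split name is simply the run timestamp with dashes and colons replaced by underscores (compare the run `2023-10-24T10:17:51.759984` with the split `2023_10_24T10_17_51.759984`). A small helper sketching that mapping (`run_timestamp_to_split` is an illustrative name, not part of any library):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp such as '2023-10-24T10:17:51.759984'
    to the corresponding split name '2023_10_24T10_17_51.759984'."""
    return timestamp.replace("-", "_").replace(":", "_")


print(run_timestamp_to_split("2023-10-24T10:17:51.759984"))
# -> 2023_10_24T10_17_51.759984
```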
## Latest results
These are the [latest results from run 2023-10-24T10:17:51.759984](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-180B/blob/main/results_2023-10-24T10-17-51.759984.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0028313758389261743,
"em_stderr": 0.0005441551135493806,
"f1": 0.06573301174496615,
"f1_stderr": 0.0013666874377791776,
"acc": 0.6642104078991223,
"acc_stderr": 0.011605139145295384
},
"harness|drop|3": {
"em": 0.0028313758389261743,
"em_stderr": 0.0005441551135493806,
"f1": 0.06573301174496615,
"f1_stderr": 0.0013666874377791776
},
"harness|gsm8k|5": {
"acc": 0.45943896891584535,
"acc_stderr": 0.01372709301042978
},
"harness|winogrande|5": {
"acc": 0.8689818468823993,
"acc_stderr": 0.009483185280160986
}
}
```
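The overall `acc` in the `all` block above is the mean of the per-task accuracies: averaging the gsm8k and winogrande values reproduces the reported number (a quick sanity check using the figures shown above):

```python
# Per-task accuracies copied from the latest results above.
task_accs = {
    "harness|gsm8k|5": 0.45943896891584535,
    "harness|winogrande|5": 0.8689818468823993,
}

# The aggregated "all" accuracy is the plain mean over tasks reporting `acc`.
overall_acc = sum(task_accs.values()) / len(task_accs)
print(overall_acc)  # close to the reported 0.6642104078991223
```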
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Wanfq/KCA_data | ---
license: cc-by-nc-4.0
language:
- en
--- |
open-llm-leaderboard/details_Zardos__Kant-Test-0.1-Mistral-7B | ---
pretty_name: Evaluation run of Zardos/Kant-Test-0.1-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Zardos/Kant-Test-0.1-Mistral-7B](https://huggingface.co/Zardos/Kant-Test-0.1-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Zardos__Kant-Test-0.1-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T11:05:46.345175](https://huggingface.co/datasets/open-llm-leaderboard/details_Zardos__Kant-Test-0.1-Mistral-7B/blob/main/results_2023-12-10T11-05-46.345175.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6253667303213345,\n\
\ \"acc_stderr\": 0.03253315196101968,\n \"acc_norm\": 0.6318205791064505,\n\
\ \"acc_norm_stderr\": 0.033203023073407084,\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.4940424629624919,\n\
\ \"mc2_stderr\": 0.014891468326851799\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216388,\n\
\ \"acc_norm\": 0.6177474402730375,\n \"acc_norm_stderr\": 0.014200454049979275\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6352320254929297,\n\
\ \"acc_stderr\": 0.004803812631994957,\n \"acc_norm\": 0.828918542123083,\n\
\ \"acc_norm_stderr\": 0.0037581050431501257\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n \
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105652,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105652\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915435,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915435\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077812,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077812\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.01403694585038139,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.01403694585038139\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3642458100558659,\n\
\ \"acc_stderr\": 0.016094338768474596,\n \"acc_norm\": 0.3642458100558659,\n\
\ \"acc_norm_stderr\": 0.016094338768474596\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818774,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818774\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687492,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687492\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.016586304901762557,\n \"mc2\": 0.4940424629624919,\n\
\ \"mc2_stderr\": 0.014891468326851799\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345396\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3115996967399545,\n \
\ \"acc_stderr\": 0.012757375376754941\n }\n}\n```"
repo_url: https://huggingface.co/Zardos/Kant-Test-0.1-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|arc:challenge|25_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|arc:challenge|25_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|arc:challenge|25_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|gsm8k|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|gsm8k|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|gsm8k|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hellaswag|10_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hellaswag|10_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hellaswag|10_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-34-29.855469.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-45-27.448654.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T11-05-46.345175.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T11-05-46.345175.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- '**/details_harness|winogrande|5_2023-12-09T19-34-29.855469.parquet'
- split: 2023_12_09T19_45_27.448654
path:
- '**/details_harness|winogrande|5_2023-12-09T19-45-27.448654.parquet'
- split: 2023_12_10T11_05_46.345175
path:
- '**/details_harness|winogrande|5_2023-12-10T11-05-46.345175.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T11-05-46.345175.parquet'
- config_name: results
data_files:
- split: 2023_12_09T19_34_29.855469
path:
- results_2023-12-09T19-34-29.855469.parquet
- split: 2023_12_09T19_45_27.448654
path:
- results_2023-12-09T19-45-27.448654.parquet
- split: 2023_12_10T11_05_46.345175
path:
- results_2023-12-10T11-05-46.345175.parquet
- split: latest
path:
- results_2023-12-10T11-05-46.345175.parquet
---
# Dataset Card for Evaluation run of Zardos/Kant-Test-0.1-Mistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Zardos/Kant-Test-0.1-Mistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Zardos/Kant-Test-0.1-Mistral-7B](https://huggingface.co/Zardos/Kant-Test-0.1-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Zardos__Kant-Test-0.1-Mistral-7B",
"harness_winogrande_5",
	split="latest")
```
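The timestamped split names are mechanical rewrites of the run time (`:` and `-` become `_`), so a specific run can be pinned programmatically by passing its split name instead of `"latest"`. A minimal sketch of mapping a split name back to a `datetime` (the split name below is taken from the configs listed in this card):

```python
from datetime import datetime

# Split names encode the run timestamp with ":" and "-" replaced by "_".
split_name = "2023_12_10T11_05_46.345175"
date_part, time_part = split_name.split("T")
run_time = datetime.strptime(
    date_part.replace("_", "-") + "T" + time_part.replace("_", ":"),
    "%Y-%m-%dT%H:%M:%S.%f",
)
print(run_time.isoformat())  # 2023-12-10T11:05:46.345175
```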
## Latest results
These are the [latest results from run 2023-12-10T11:05:46.345175](https://huggingface.co/datasets/open-llm-leaderboard/details_Zardos__Kant-Test-0.1-Mistral-7B/blob/main/results_2023-12-10T11-05-46.345175.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6253667303213345,
"acc_stderr": 0.03253315196101968,
"acc_norm": 0.6318205791064505,
"acc_norm_stderr": 0.033203023073407084,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762557,
"mc2": 0.4940424629624919,
"mc2_stderr": 0.014891468326851799
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216388,
"acc_norm": 0.6177474402730375,
"acc_norm_stderr": 0.014200454049979275
},
"harness|hellaswag|10": {
"acc": 0.6352320254929297,
"acc_stderr": 0.004803812631994957,
"acc_norm": 0.828918542123083,
"acc_norm_stderr": 0.0037581050431501257
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105652,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105652
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915435,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915435
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077812,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077812
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.01403694585038139,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.01403694585038139
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3642458100558659,
"acc_stderr": 0.016094338768474596,
"acc_norm": 0.3642458100558659,
"acc_norm_stderr": 0.016094338768474596
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818774,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818774
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824873,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687492,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687492
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762557,
"mc2": 0.4940424629624919,
"mc2_stderr": 0.014891468326851799
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345396
},
"harness|gsm8k|5": {
"acc": 0.3115996967399545,
"acc_stderr": 0.012757375376754941
}
}
```
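Once this JSON is parsed (e.g. with `json.loads`), ranking tasks by score is straightforward. A small sketch over a trimmed-down copy of the figures above (the field names match the results block; the trimming itself is illustrative):

```python
import json

# A reduced excerpt of the results JSON shown above.
results_json = """{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36},
  "harness|hendrycksTest-computer_security|5": {"acc": 0.78},
  "harness|hendrycksTest-college_mathematics|5": {"acc": 0.34}
}"""

# Map each task name to its accuracy, then pick the extremes.
scores = {task: metrics["acc"] for task, metrics in json.loads(results_json).items()}
best = max(scores, key=scores.get)   # highest accuracy
worst = min(scores, key=scores.get)  # lowest accuracy
print(best, worst)
```

The same dictionary comprehension works on the full results file, skipping the `"all"` aggregate entry if only per-task scores are wanted.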
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
venkat-srinivasan-nexusflow/multiapi_prototype_CVECPE_Only | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prediction
dtype: string
- name: ground_truth
dtype: string
- name: correctness
dtype: int64
splits:
- name: split_20231020_172523
num_bytes: 23946
num_examples: 78
- name: split_20231019_234916
num_bytes: 23946
num_examples: 78
download_size: 28725
dataset_size: 47892
---
# Dataset Card for "multiapi_prototype_CVECPE_Only"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_scb10x__typhoon-7b | ---
pretty_name: Evaluation run of scb10x/typhoon-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [scb10x/typhoon-7b](https://huggingface.co/scb10x/typhoon-7b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_scb10x__typhoon-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T23:54:04.797945](https://huggingface.co/datasets/open-llm-leaderboard/details_scb10x__typhoon-7b/blob/main/results_2023-12-29T23-54-04.797945.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5930329978732358,\n\
\ \"acc_stderr\": 0.03323235696154905,\n \"acc_norm\": 0.5989902156875104,\n\
\ \"acc_norm_stderr\": 0.03392042546889035,\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.01534540948555798,\n \"mc2\": 0.4052198339452636,\n\
\ \"mc2_stderr\": 0.014069431569242152\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.01455594976049644,\n\
\ \"acc_norm\": 0.5853242320819113,\n \"acc_norm_stderr\": 0.014397070564409174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6101374228241386,\n\
\ \"acc_stderr\": 0.004867221634461272,\n \"acc_norm\": 0.8154750049790879,\n\
\ \"acc_norm_stderr\": 0.0038711896202760685\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.042992689054808644,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.042992689054808644\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.02964781353936524,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.02964781353936524\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936336,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936336\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n\
\ \"acc_stderr\": 0.02590608702131929,\n \"acc_norm\": 0.7064516129032258,\n\
\ \"acc_norm_stderr\": 0.02590608702131929\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365907,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365907\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915331,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915331\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.02478431694215638,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215638\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7706422018348624,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.7706422018348624,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923393,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923393\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.02390232554956039,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.02390232554956039\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7867177522349936,\n\
\ \"acc_stderr\": 0.014648172749593524,\n \"acc_norm\": 0.7867177522349936,\n\
\ \"acc_norm_stderr\": 0.014648172749593524\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.02541600377316555,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.02541600377316555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3106145251396648,\n\
\ \"acc_stderr\": 0.015476515438005566,\n \"acc_norm\": 0.3106145251396648,\n\
\ \"acc_norm_stderr\": 0.015476515438005566\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.0264930332251459,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.0264930332251459\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087375,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087375\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.02993534270787774,\n\
\ \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.02993534270787774\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6372549019607843,\n \"acc_stderr\": 0.01945076843250551,\n \
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.01945076843250551\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.01534540948555798,\n \"mc2\": 0.4052198339452636,\n\
\ \"mc2_stderr\": 0.014069431569242152\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237992\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3161485974222896,\n \
\ \"acc_stderr\": 0.012807630673451495\n }\n}\n```"
repo_url: https://huggingface.co/scb10x/typhoon-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|arc:challenge|25_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|gsm8k|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hellaswag|10_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T23-54-04.797945.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T23-54-04.797945.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- '**/details_harness|winogrande|5_2023-12-29T23-54-04.797945.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T23-54-04.797945.parquet'
- config_name: results
data_files:
- split: 2023_12_29T23_54_04.797945
path:
- results_2023-12-29T23-54-04.797945.parquet
- split: latest
path:
- results_2023-12-29T23-54-04.797945.parquet
---
# Dataset Card for Evaluation run of scb10x/typhoon-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [scb10x/typhoon-7b](https://huggingface.co/scb10x/typhoon-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_scb10x__typhoon-7b",
"harness_winogrande_5",
	split="latest")
```
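The timestamped split names in the configurations above appear to be the run timestamp with `-` and `:` replaced by `_` (an observation from this card's config list, not a documented convention); a minimal sketch:

```python
# Sketch: map a run timestamp to the split name used in this card's configs.
# Observed pattern (not a documented API): '-' and ':' become '_',
# while the fractional-seconds '.' is kept.
def timestamp_to_split(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

split_name = timestamp_to_split("2023-12-29T23:54:04.797945")
print(split_name)  # -> 2023_12_29T23_54_04.797945
```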
## Latest results
These are the [latest results from run 2023-12-29T23:54:04.797945](https://huggingface.co/datasets/open-llm-leaderboard/details_scb10x__typhoon-7b/blob/main/results_2023-12-29T23-54-04.797945.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5930329978732358,
"acc_stderr": 0.03323235696154905,
"acc_norm": 0.5989902156875104,
"acc_norm_stderr": 0.03392042546889035,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.01534540948555798,
"mc2": 0.4052198339452636,
"mc2_stderr": 0.014069431569242152
},
"harness|arc:challenge|25": {
"acc": 0.5435153583617748,
"acc_stderr": 0.01455594976049644,
"acc_norm": 0.5853242320819113,
"acc_norm_stderr": 0.014397070564409174
},
"harness|hellaswag|10": {
"acc": 0.6101374228241386,
"acc_stderr": 0.004867221634461272,
"acc_norm": 0.8154750049790879,
"acc_norm_stderr": 0.0038711896202760685
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.042992689054808644,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.042992689054808644
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.02964781353936524,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.02964781353936524
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.02590608702131929,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.02590608702131929
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365907,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365907
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915331,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915331
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215638,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215638
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7706422018348624,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.7706422018348624,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.02390232554956039,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.02390232554956039
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593524,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593524
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.02541600377316555,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.02541600377316555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3106145251396648,
"acc_stderr": 0.015476515438005566,
"acc_norm": 0.3106145251396648,
"acc_norm_stderr": 0.015476515438005566
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.0264930332251459,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.0264930332251459
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087375,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087375
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.02993534270787774,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.02993534270787774
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.01945076843250551,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.01945076843250551
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.01534540948555798,
"mc2": 0.4052198339452636,
"mc2_stderr": 0.014069431569242152
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237992
},
"harness|gsm8k|5": {
"acc": 0.3161485974222896,
"acc_stderr": 0.012807630673451495
}
}
```
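For reference, the top-level `"all"` block aggregates the per-task scores. An illustrative sketch below macro-averages `acc` over a few of the tasks shown above; this is not the leaderboard's exact aggregation recipe, just a demonstration of how such summaries can be derived:

```python
# Illustrative sketch: macro-average the per-task "acc" values from a results
# dict shaped like the JSON above. NOT the leaderboard's exact aggregation.
sample_results = {
    "harness|arc:challenge|25": {"acc": 0.5435153583617748},
    "harness|hellaswag|10": {"acc": 0.6101374228241386},
    "harness|winogrande|5": {"acc": 0.7655880031570639},
}

per_task_acc = [task["acc"] for task in sample_results.values()]
macro_avg = sum(per_task_acc) / len(per_task_acc)
print(f"macro-averaged acc over {len(per_task_acc)} tasks: {macro_avg:.4f}")
```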
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TheDuyx/bass_data | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': '808'
'1': brass
'2': growl
'3': jump_up
'4': reese
'5': slap
'6': sub
'7': whomp
- name: input_values
sequence: float32
- name: attention_mask
sequence: int32
splits:
- name: train
num_bytes: 3588435736
num_examples: 33388
- name: test
num_bytes: 399348632
num_examples: 3710
download_size: 1893499877
dataset_size: 3987784368
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
yuvalkirstain/yuvalkirstain-pickapic-ft-eval-random-prompts | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 31392
num_examples: 200
download_size: 11259
dataset_size: 31392
---
# Dataset Card for "yuvalkirstain-pickapic-ft-eval-random-prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rombodawg/airoboros-2.1_general_purpose | ---
license: apache-2.0
---
This is the airoboros-2.1 dataset, simplified and generalized to be usable with any AI model.
Original dataset below:
- https://huggingface.co/datasets/jondurbin/airoboros-2.1 |
yashraizad/yelp-open-dataset-users | ---
license: apache-2.0
---
|
sexoeprazer/athena001 | ---
license: openrail
---
|
TheFinAI/flare-es-efpa | ---
dataset_info:
features:
- name: 'query:'
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
- name: query
dtype: string
splits:
- name: test
num_bytes: 353055
num_examples: 228
download_size: 141839
dataset_size: 353055
---
# Dataset Card for "flare-es-efpa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_SC44__Mistral-7B-private-spnf | ---
pretty_name: Evaluation run of SC44/Mistral-7B-private-spnf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SC44/Mistral-7B-private-spnf](https://huggingface.co/SC44/Mistral-7B-private-spnf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"latest\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC44__Mistral-7B-private-spnf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T06:39:45.852747](https://huggingface.co/datasets/open-llm-leaderboard/details_SC44__Mistral-7B-private-spnf/blob/main/results_2024-01-28T06-39-45.852747.json)(note\
  \ that there might be results for other tasks in the repo if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6081076853099489,\n\
\ \"acc_stderr\": 0.03312456837122671,\n \"acc_norm\": 0.6126302396586892,\n\
\ \"acc_norm_stderr\": 0.033796266236967715,\n \"mc1\": 0.5299877600979193,\n\
\ \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6834378173026425,\n\
\ \"mc2_stderr\": 0.015179197426716372\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5895904436860068,\n \"acc_stderr\": 0.014374922192642664,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491888\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6685919139613623,\n\
\ \"acc_stderr\": 0.004697573962169424,\n \"acc_norm\": 0.8490340569607648,\n\
\ \"acc_norm_stderr\": 0.0035728399695219935\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.039531733777491945,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.039531733777491945\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.037507570448955356,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.037507570448955356\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.027379871229943245,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.027379871229943245\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.02515826601686858,\n \
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.02515826601686858\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145624,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145624\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333557,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333557\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n\
\ \"acc_stderr\": 0.015609929559348406,\n \"acc_norm\": 0.3206703910614525,\n\
\ \"acc_norm_stderr\": 0.015609929559348406\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868045,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291463,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291463\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4315514993481095,\n\
\ \"acc_stderr\": 0.012650007999463872,\n \"acc_norm\": 0.4315514993481095,\n\
\ \"acc_norm_stderr\": 0.012650007999463872\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n\
\ \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6339869281045751,\n \"acc_stderr\": 0.019488025745529675,\n \
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.019488025745529675\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5299877600979193,\n\
\ \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6834378173026425,\n\
\ \"mc2_stderr\": 0.015179197426716372\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902547\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39651250947687644,\n \
\ \"acc_stderr\": 0.013474258584033345\n }\n}\n```"
repo_url: https://huggingface.co/SC44/Mistral-7B-private-spnf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|arc:challenge|25_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|arc:challenge|25_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|gsm8k|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|gsm8k|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hellaswag|10_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hellaswag|10_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T06-36-37.050829.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T06-39-45.852747.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T06-39-45.852747.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- '**/details_harness|winogrande|5_2024-01-28T06-36-37.050829.parquet'
- split: 2024_01_28T06_39_45.852747
path:
- '**/details_harness|winogrande|5_2024-01-28T06-39-45.852747.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T06-39-45.852747.parquet'
- config_name: results
data_files:
- split: 2024_01_28T06_36_37.050829
path:
- results_2024-01-28T06-36-37.050829.parquet
- split: 2024_01_28T06_39_45.852747
path:
- results_2024-01-28T06-39-45.852747.parquet
- split: latest
path:
- results_2024-01-28T06-39-45.852747.parquet
---
# Dataset Card for Evaluation run of SC44/Mistral-7B-private-spnf
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SC44/Mistral-7B-private-spnf](https://huggingface.co/SC44/Mistral-7B-private-spnf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SC44__Mistral-7B-private-spnf",
"harness_winogrande_5",
	split="latest")
```
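The timestamped split names above are derived mechanically from each run's timestamp. A minimal sketch of that mapping, assuming the naming convention visible in this card's config list (dashes and colons replaced with underscores, fractional seconds kept):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp like '2024-01-28T06:39:45.852747'
    into the corresponding split name used in this dataset,
    e.g. '2024_01_28T06_39_45.852747'."""
    # Date separators and time separators both become underscores;
    # the dot before fractional seconds is preserved.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-01-28T06:39:45.852747"))
# 2024_01_28T06_39_45.852747
```

This is only a convenience for mapping a run timestamp (as shown in the results JSON) to a split name; the "latest" split is the simpler way to get the most recent run.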
## Latest results

These are the [latest results from run 2024-01-28T06:39:45.852747](https://huggingface.co/datasets/open-llm-leaderboard/details_SC44__Mistral-7B-private-spnf/blob/main/results_2024-01-28T06-39-45.852747.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6081076853099489,
"acc_stderr": 0.03312456837122671,
"acc_norm": 0.6126302396586892,
"acc_norm_stderr": 0.033796266236967715,
"mc1": 0.5299877600979193,
"mc1_stderr": 0.01747199209169754,
"mc2": 0.6834378173026425,
"mc2_stderr": 0.015179197426716372
},
"harness|arc:challenge|25": {
"acc": 0.5895904436860068,
"acc_stderr": 0.014374922192642664,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491888
},
"harness|hellaswag|10": {
"acc": 0.6685919139613623,
"acc_stderr": 0.004697573962169424,
"acc_norm": 0.8490340569607648,
"acc_norm_stderr": 0.0035728399695219935
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.039531733777491945,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.039531733777491945
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.037507570448955356,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.037507570448955356
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137602,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.027379871229943245,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.027379871229943245
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306443,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.02515826601686858,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.02515826601686858
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145624,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333557,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333557
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.015609929559348406,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.015609929559348406
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868045,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291463,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291463
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4315514993481095,
"acc_stderr": 0.012650007999463872,
"acc_norm": 0.4315514993481095,
"acc_norm_stderr": 0.012650007999463872
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.019488025745529675,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.019488025745529675
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5299877600979193,
"mc1_stderr": 0.01747199209169754,
"mc2": 0.6834378173026425,
"mc2_stderr": 0.015179197426716372
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902547
},
"harness|gsm8k|5": {
"acc": 0.39651250947687644,
"acc_stderr": 0.013474258584033345
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mirfan899/jalandhary_asr | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
splits:
- name: train
num_bytes: 2710979825.076
num_examples: 10287
download_size: 2711170960
dataset_size: 2710979825.076
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lexklima/locbp | ---
license: creativeml-openrail-m
---
|
cheafdevo56/influential_citations_triplets | ---
dataset_info:
features:
- name: query
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: pos
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: neg
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: score
dtype: int64
- name: title
dtype: string
splits:
- name: train
num_bytes: 76694722
num_examples: 20141
download_size: 45629285
dataset_size: 76694722
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Lickilis/Likilis | ---
license: openrail
---
|
DBQ/Louis.Vuitton.Product.prices.France | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: France - Louis Vuitton - Product-level price list
tags:
- webscraping
- ecommerce
- Louis Vuitton
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 3471507
num_examples: 7806
download_size: 915905
dataset_size: 3471507
---
# Louis Vuitton web scraped data
## About the website
The **Luxury Fashion** industry in the **EMEA** region, specifically in **France**, is a competitive and dynamic sector representing some of the most prestigious names in fashion. **Louis Vuitton**, a prominent player, is renowned for its high-end products, setting the tone for luxury retail within and beyond France. The recent rise of digital transformation has intensified the focus on **Ecommerce** within this industry. The dataset in review contains **Ecommerce product-list page (PLP) data** on Louis Vuitton in France. This data provides valuable insights into consumer behaviour, purchasing patterns, product preferences, and overall performance metrics of Louis Vuitton's online retail.
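The schema above carries both `full_price_eur` and `price_eur` together with a `flg_discount` flag. As a hypothetical illustration (the product codes and prices below are invented, not taken from the dataset), a discount percentage can be derived per row like this:

```python
# Invented sample rows following the dataset schema (not real Louis Vuitton data)
rows = [
    {"product_code": "M00001", "full_price_eur": 1500.0, "price_eur": 1500.0, "flg_discount": 0},
    {"product_code": "M00002", "full_price_eur": 1200.0, "price_eur": 960.0,  "flg_discount": 1},
]

def discount_pct(row):
    """Discount as a percentage of the full price; 0 when flg_discount is unset."""
    if not row["flg_discount"]:
        return 0.0
    return round(100 * (1 - row["price_eur"] / row["full_price_eur"]), 1)

print([discount_pct(r) for r in rows])  # [0.0, 20.0]
```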
## Link to **dataset**
[France - Louis Vuitton - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Louis%20Vuitton%20Product-prices%20France/r/recuJmM9pHnEywNtQ)
|
rw-r-r-0644/assignment-2 | ---
license: cc-by-2.0
task_categories:
- fill-mask
- text-classification
- text2text-generation
language:
- en
size_categories:
- 10K<n<100K
configs:
- config_name: default
data_files:
- split: train
path: "train.jsonl"
- split: validation
path: "validation.jsonl"
- split: test
path: "test.jsonl"
---
# assignment-2
Dataset for the second assignment of the Deep Learning course at my university.
The dataset appears to be based on the [WinoGrande](https://winogrande.allenai.org) train_xl set (for the original training.jsonl) and dev/validation set (for test.jsonl).
The provided training set has been randomly split into training (85%) and validation (15%) sets.
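The 85/15 split described above can be sketched as follows; this is a minimal illustration of a seeded random split, not the course's actual script:

```python
import random

def split_train_val(records, val_frac=0.15, seed=0):
    """Randomly split a list of records into train/validation subsets."""
    rng = random.Random(seed)           # fixed seed for reproducibility
    idx = list(range(len(records)))
    rng.shuffle(idx)
    n_val = int(len(records) * val_frac)
    val = [records[i] for i in idx[:n_val]]
    train = [records[i] for i in idx[n_val:]]
    return train, val

train, val = split_train_val([{"qID": i} for i in range(100)])
print(len(train), len(val))  # 85 15
```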
### Citation Information
```
@InProceedings{ai2:winogrande,
title = {WinoGrande: An Adversarial Winograd Schema Challenge at Scale},
authors={Keisuke, Sakaguchi and Ronan, Le Bras and Chandra, Bhagavatula and Yejin, Choi
},
year={2019}
}
```
### Credits
Credits to allenai and the authors of the WinoGrande dataset.
The original dataset repository can be found on [github](https://github.com/allenai/winogrande).
|
Flyfer/testDataset | ---
license: apache-2.0
---
|
marcus2000/Names2chinese | ---
dataset_info:
features:
- name: English_name
dtype: string
- name: Chinese_name
dtype: string
splits:
- name: train
num_bytes: 15768
num_examples: 723
- name: test
num_bytes: 1409
num_examples: 63
download_size: 15737
dataset_size: 17177
---
# Dataset Card for "Names2chinese"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_50_1713132027 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 240189
num_examples: 597
download_size: 129747
dataset_size: 240189
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
desertfox/hsereg2022 | ---
license: afl-3.0
---
|
pharaouk/stack-v2-python-chunk3 | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: snapshot_id
dtype: string
- name: revision_id
dtype: string
- name: directory_id
dtype: string
- name: branch_name
dtype: string
- name: visit_date
dtype: timestamp[ns]
- name: revision_date
dtype: timestamp[ns]
- name: committer_date
dtype: timestamp[ns]
- name: github_id
dtype: int64
- name: star_events_count
dtype: int64
- name: fork_events_count
dtype: int64
- name: gha_license_id
dtype: string
- name: gha_created_at
dtype: timestamp[ns]
- name: gha_updated_at
dtype: timestamp[ns]
- name: gha_pushed_at
dtype: timestamp[ns]
- name: gha_language
dtype: string
- name: files
list:
- name: blob_id
dtype: string
- name: path
dtype: string
- name: content_id
dtype: string
- name: language
dtype: string
- name: length_bytes
dtype: int64
- name: detected_licenses
sequence: string
- name: license_type
dtype: string
- name: src_encoding
dtype: string
- name: is_vendor
dtype: bool
- name: is_generated
dtype: bool
- name: alphanum_fraction
dtype: float32
- name: alpha_fraction
dtype: float32
- name: num_lines
dtype: int32
- name: avg_line_length
dtype: float32
- name: max_line_length
dtype: int32
- name: num_files
dtype: int64
splits:
- name: train
num_bytes: 7897866927
num_examples: 2984967
download_size: 5062103145
dataset_size: 7897866927
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TUKE-DeutscheTelekom/skquad | ---
annotations_creators:
- crowdsourced
language:
- sk
language_creators:
- crowdsourced
- found
license:
- cc-by-sa-4.0
- cc-by-4.0
multilinguality:
- monolingual
paperswithcode_id: squad
pretty_name: skquad
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- wikipedia
task_categories:
- question-answering
- text-retrieval
task_ids:
- open-domain-qa
- extractive-qa
- document-retrieval
train-eval-index:
- col_mapping:
answers:
answer_start: answer_start
text: text
context: context
question: question
config: squad_v2
metrics:
- name: SQuAD v2
type: squad_v2
splits:
eval_split: validation
train_split: train
task: question-answering
task_id: extractive_question_answering
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
SK-QuAD is the first QA dataset for the Slovak language.
It is manually annotated, so it has no distortion caused by
machine translation. The dataset is thematically diverse – it
does not overlap with SQuAD – it brings new knowledge.
It passed a second round of annotation – each question
and answer was seen by at least two annotators.
### Supported Tasks and Leaderboards
- Question answering
- Document retrieval
### Languages
- Slovak
## Dataset Structure
#### squad_v2
- **Size of downloaded dataset files:** 44.34 MB
- **Size of the generated dataset:** 122.57 MB
- **Total amount of disk used:** 166.91 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [94, 87, 94, 94],
"text": ["10th and 11th centuries", "in the 10th and 11th centuries", "10th and 11th centuries", "10th and 11th centuries"]
},
"context": "\"The Normans (Norman: Nourmands; French: Normands; Latin: Normanni) were the people who in the 10th and 11th centuries gave thei...",
"id": "56ddde6b9a695914005b9629",
"question": "When were the Normans in Normandy?",
"title": "Normans"
}
```
### Data Fields
The data fields are the same among all splits.
#### squad_v2
- `id`: a `string` feature.
- `title`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
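The fields fit together the same way as in SQuAD: each `answer_start` indexes into `context`, and slicing the context at that offset must reproduce the answer text. A small self-contained sketch (the record below is a shortened stand-in, not an actual SK-QuAD example):

```python
# Stand-in record in the SQuAD-style schema described above
example = {
    "context": "The Normans were the people who in the 10th and 11th centuries "
               "gave their name to Normandy.",
    "question": "When were the Normans in Normandy?",
    "answers": {
        "text": ["10th and 11th centuries"],
        "answer_start": [39],
    },
}

def check_answer_spans(ex):
    """True when every answer span reproduces its answer text exactly."""
    ctx = ex["context"]
    return all(
        ctx[s:s + len(t)] == t
        for t, s in zip(ex["answers"]["text"], ex["answers"]["answer_start"])
    )

print(check_answer_spans(example))  # True
```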
### Data Splits
| | Train | Dev | Translated |
| ------------- | -----: | -----: | -------: |
| Documents | 8,377 | 940 | 442 |
| Paragraphs | 22,062 | 2,568 | 18,931 |
| Questions | 81,582 | 9,583 | 120,239 |
| Answers | 65,839 | 7,822 | 79,978 |
| Unanswerable | 15,877 | 1,784 | 40,261 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
- Deutsche Telekom Systems Solutions Slovakia
- Technical University of Košice
### Licensing Information
Attribution-ShareAlike 4.0 International (CC BY-SA 4.0)
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
|
FreedomIntelligence/MMLU_Portuguese | ---
license: mit
language:
- pt
---
Portuguese version of the MMLU dataset, translated by gpt-3.5-turbo.
The dataset is used in the research related to [MultilingualSIFT](https://github.com/FreedomIntelligence/MultilingualSIFT). |
mickylan2367/ZipfilePractice | ---
license: cc-by-sa-4.0
language:
- en
tags:
- music
- spectrogram
- text
- text2music
size_categories:
- 1K<n<10K
---
# Dataset of spectrograms generated from the music in Google/MusicCaps
* The content is the same as <a href="https://huggingface.co/datasets/mickylan2367/GraySpectrogram">mickylan2367/GraySpectrogram</a>.
* However, since this dataset stores the data itself in zip files, it downloads (slightly) faster than GraySpectrogram.
## Basic information
* sampling_rate: int = 44100
* 20-second WAV files converted to 1600×800 PNG files
* Following librosa's conventions, the vertical axis of the image spans (0–10000? Hz) and the horizontal axis spans (0–40 seconds)
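The conversion outlined above (a 20-second waveform rendered as a 1600×800 spectrogram image) can be sketched with a plain NumPy STFT; the `n_fft` and hop choices here are illustrative assumptions, not the dataset's actual librosa settings:

```python
import numpy as np

def spectrogram_image(wave, n_fft=2048, hop=None, out_w=1600, out_h=800):
    """Log-magnitude STFT resized (nearest-neighbour) to out_w x out_h."""
    if hop is None:
        hop = max(1, len(wave) // out_w)
    window = np.hanning(n_fft)
    frames = []
    for start in range(0, len(wave) - n_fft, hop):
        frames.append(np.abs(np.fft.rfft(wave[start:start + n_fft] * window)))
    S = np.array(frames).T                # (freq_bins, time_frames)
    S_db = 20 * np.log10(S + 1e-10)       # log magnitude in dB
    fi = np.linspace(0, S_db.shape[0] - 1, out_h).astype(int)
    ti = np.linspace(0, S_db.shape[1] - 1, out_w).astype(int)
    return S_db[fi][:, ti]

# 20 seconds of a 440 Hz tone at 44.1 kHz, matching the dataset description
t = np.linspace(0, 20, 20 * 44100, endpoint=False)
img = spectrogram_image(np.sin(2 * np.pi * 440 * t))
print(img.shape)  # (800, 1600)
```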
|
Sharathhebbar24/MetaMathQA | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 314797881
num_examples: 390062
download_size: 131558400
dataset_size: 314797881
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- math
size_categories:
- 10K<n<100K
---
# Meta Math Filtered
This is a combined and filtered version (with all redundant rows removed) of [meta-math/MetaMathQA](https://huggingface.co/datasets/meta-math/MetaMathQA) and [meta-math/MetaMathQA-40K](https://huggingface.co/datasets/meta-math/MetaMathQA-40K)
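The redundancy filtering described above can be sketched as an exact-match deduplication on the `prompt` field; this is a hedged illustration, not the script actually used to build the dataset:

```python
def dedup_prompts(rows):
    """Keep only the first occurrence of each exact prompt string."""
    seen, out = set(), []
    for r in rows:
        if r["prompt"] not in seen:
            seen.add(r["prompt"])
            out.append(r)
    return out

rows = [{"prompt": "Q: 1+1?"}, {"prompt": "Q: 2+2?"}, {"prompt": "Q: 1+1?"}]
print(len(dedup_prompts(rows)))  # 2
```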
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("Sharathhebbar24/MetaMathQA", split="train")
``` |
huggingartists/arctic-monkeys | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/arctic-monkeys"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.246691 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/12c27f4fbb06ef32dc1c1e432098f447.570x570x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/arctic-monkeys">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Arctic Monkeys</div>
<a href="https://genius.com/artists/arctic-monkeys">
<div style="text-align: center; font-size: 14px;">@arctic-monkeys</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/arctic-monkeys).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/arctic-monkeys")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
| 186 | - | - |
'Train' can easily be divided into 'train', 'validation' & 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/arctic-monkeys")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
georgesuper/houseDataset | ---
license: apache-2.0
---
|
mole-code/dev.langchain4j-data | ---
dataset_info:
features:
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: train
num_bytes: 1814444
num_examples: 329
download_size: 483919
dataset_size: 1814444
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jondurbin/rosettacode-raw | ---
license: gfdl
---
|
kristmh/low_vs_random | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validate
path: data/validate-*
dataset_info:
features:
- name: text_clean
dtype: string
- name: labels
dtype: int64
- name: class
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: test
num_bytes: 19739399
num_examples: 24512
- name: train
num_bytes: 158958252
num_examples: 196090
- name: validate
num_bytes: 19103258
num_examples: 24511
download_size: 97810741
dataset_size: 197800909
---
# Dataset Card for "low_vs_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
laamaai/ele-int | ---
dataset_info:
features:
- name: ELE
dtype: string
- name: INT
dtype: string
splits:
- name: train
num_bytes: 332689.5077658303
num_examples: 1339
- name: test
num_bytes: 83234.49223416965
num_examples: 335
download_size: 292949
dataset_size: 415924.0
---
# Dataset Card for "ele-int"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Harsh-7300/datagrade | ---
license: mit
---
|
memray/AugTriever-AugQ-CC | ---
license: mit
---
AugQ-CC is an unsupervised augmented dataset for training retrievers used in `AugTriever: Unsupervised Dense Retrieval by Scalable Data Augmentation`.
It consists of 52.4M pseudo query-document pairs based on [Pile-CommonCrawl](https://pile.eleuther.ai/paper.pdf).
```
@article{meng2022augtriever,
title={AugTriever: Unsupervised Dense Retrieval by Scalable Data
Augmentation},
author={Meng, Rui and Liu, Ye and Yavuz, Semih and Agarwal, Divyansh and Tu, Lifu and Yu, Ning and Zhang, Jianguo and Bhat, Meghana and Zhou, Yingbo},
journal={arXiv preprint arXiv:2212.08841},
year={2022}
}
``` |
mlxen/squad_1_1_contrastdata_training | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 79340259
num_examples: 87599
download_size: 0
dataset_size: 79340259
---
# Dataset Card for "squad_1_1_contrastdata_training"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Chinese_Mandarin_Synthesis_Corpus-Male_Customer_Service | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Chinese_Mandarin_Synthesis_Corpus-Male_Customer_Service
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1099?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
20 Hours - Chinese Mandarin Synthesis Corpus-Male, Customer Service. It is recorded by native Chinese speakers whose voices are rich and resonant. Phoneme coverage is balanced, and professional phoneticians participated in the annotation. It precisely matches the research and development needs of speech synthesis.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1099?source=Huggingface
### Supported Tasks and Leaderboards
tts: The dataset can be used to train a model for Text to Speech (TTS).
### Languages
Chinese Mandarin
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
open-llm-leaderboard/details_liminerity__Multiverse-Experiment-slerp-7b | ---
pretty_name: Evaluation run of liminerity/Multiverse-Experiment-slerp-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liminerity/Multiverse-Experiment-slerp-7b](https://huggingface.co/liminerity/Multiverse-Experiment-slerp-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Multiverse-Experiment-slerp-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T12:02:01.054194](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Multiverse-Experiment-slerp-7b/blob/main/results_2024-03-09T12-02-01.054194.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6518510677976063,\n\
\ \"acc_stderr\": 0.03203683002461246,\n \"acc_norm\": 0.6505703294705394,\n\
\ \"acc_norm_stderr\": 0.03271494006314174,\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7792924999444771,\n\
\ \"mc2_stderr\": 0.013713064522592473\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7167235494880546,\n \"acc_stderr\": 0.013167478735134575,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7177853017327226,\n\
\ \"acc_stderr\": 0.0044915745394418834,\n \"acc_norm\": 0.8914558852818164,\n\
\ \"acc_norm_stderr\": 0.003104306434972473\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n\
\ \"acc_stderr\": 0.016542401954631917,\n \"acc_norm\": 0.42681564245810055,\n\
\ \"acc_norm_stderr\": 0.016542401954631917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\
\ \"acc_stderr\": 0.012756161942523369,\n \"acc_norm\": 0.4765319426336376,\n\
\ \"acc_norm_stderr\": 0.012756161942523369\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7792924999444771,\n\
\ \"mc2_stderr\": 0.013713064522592473\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065604\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7172100075815011,\n \
\ \"acc_stderr\": 0.012405020417873619\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/Multiverse-Experiment-slerp-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|arc:challenge|25_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|gsm8k|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hellaswag|10_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T12-02-01.054194.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T12-02-01.054194.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- '**/details_harness|winogrande|5_2024-03-09T12-02-01.054194.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T12-02-01.054194.parquet'
- config_name: results
data_files:
- split: 2024_03_09T12_02_01.054194
path:
- results_2024-03-09T12-02-01.054194.parquet
- split: latest
path:
- results_2024-03-09T12-02-01.054194.parquet
---
# Dataset Card for Evaluation run of liminerity/Multiverse-Experiment-slerp-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/Multiverse-Experiment-slerp-7b](https://huggingface.co/liminerity/Multiverse-Experiment-slerp-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Multiverse-Experiment-slerp-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-09T12:02:01.054194](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Multiverse-Experiment-slerp-7b/blob/main/results_2024-03-09T12-02-01.054194.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in its own configuration, with the "latest" split pointing to its most recent eval):
```python
{
"all": {
"acc": 0.6518510677976063,
"acc_stderr": 0.03203683002461246,
"acc_norm": 0.6505703294705394,
"acc_norm_stderr": 0.03271494006314174,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7792924999444771,
"mc2_stderr": 0.013713064522592473
},
"harness|arc:challenge|25": {
"acc": 0.7167235494880546,
"acc_stderr": 0.013167478735134575,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.7177853017327226,
"acc_stderr": 0.0044915745394418834,
"acc_norm": 0.8914558852818164,
"acc_norm_stderr": 0.003104306434972473
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.016542401954631917,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.016542401954631917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523369,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523369
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7792924999444771,
"mc2_stderr": 0.013713064522592473
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065604
},
"harness|gsm8k|5": {
"acc": 0.7172100075815011,
"acc_stderr": 0.012405020417873619
}
}
```
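As a rough sketch of how the per-task metrics above relate to the aggregate "all" block: the aggregate accuracy is the mean of the per-task accuracies. The snippet below illustrates this with a small hand-copied subset of the MMLU scores shown above (only three tasks, so its mean will not match the full reported aggregate of 0.6519):

```python
# Illustrative sketch: the "all" block averages per-task accuracies.
# These three values are copied from the results JSON above; using a
# subset means the mean differs from the full 57-task aggregate.
scores = {
    "hendrycksTest-abstract_algebra": 0.31,
    "hendrycksTest-anatomy": 0.6370370370370371,
    "hendrycksTest-astronomy": 0.7105263157894737,
}

mean_acc = sum(scores.values()) / len(scores)
print(round(mean_acc, 4))  # → 0.5525
```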
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
toshi456/Gemma-Alpaca-Data-13k | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for "Gemma-Alpaca-Data-13k"
## Dataset Details
**Dataset Type:** Gemma-Alpaca-Data-13k is generated automatically using [google/gemma-7b-it](https://huggingface.co/google/gemma-7b-it) with reference to the data generation method in [Stanford Alpaca](https://crfm.stanford.edu/2023/03/13/alpaca.html).
**Resources for More Information:** In preparation
**License:** [Apache license 2.0](https://www.apache.org/licenses/LICENSE-2.0)
**Questions or Comments:**
## Acknowledgement
- [Stanford Alpaca](https://crfm.stanford.edu/2023/03/13/alpaca.html)
- [Gemma](https://ai.google.dev/gemma/docs?hl=en) |
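The Stanford Alpaca pipeline referenced above expands a small pool of seed tasks by repeatedly prompting an instruction-tuned model for new instruction/response pairs. The following is a minimal, self-contained sketch of that loop; `generate` is a stand-in for an actual call to google/gemma-7b-it, and the prompt wording and field names are illustrative assumptions, not the exact pipeline used to build this dataset:

```python
import json
import random

def generate(prompt: str) -> str:
    # Stand-in for a call to google/gemma-7b-it; returns a canned pair here.
    return json.dumps({"instruction": "Name three primary colors.",
                       "output": "Red, yellow, and blue."})

def expand(seed_tasks, n_new):
    """Alpaca-style self-instruct: sampled seed pairs prime the model to emit new ones."""
    data = list(seed_tasks)
    while len(data) < len(seed_tasks) + n_new:
        examples = random.sample(data, k=min(3, len(data)))
        prompt = ("Write a new instruction/output pair like these:\n"
                  + "\n".join(json.dumps(e) for e in examples))
        pair = json.loads(generate(prompt))
        if pair["instruction"] in {d["instruction"] for d in data}:
            break  # model repeated itself; stop in this toy sketch
        data.append(pair)
    return data

seeds = [{"instruction": "Translate 'hello' to French.", "output": "Bonjour."}]
dataset = expand(seeds, n_new=1)
```

In the real pipeline, deduplication is typically done with similarity filtering (e.g. ROUGE-L) rather than exact string matching.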
Affaan/CustomOpenOrca | ---
language:
- en
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1791415
num_examples: 1000
download_size: 1023995
dataset_size: 1791415
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gagan3012/oasis | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 6962220398
num_examples: 2344376
download_size: 3410521074
dataset_size: 6962220398
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_225 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1123778940.0
num_examples: 220695
download_size: 1148347695
dataset_size: 1123778940.0
---
# Dataset Card for "chunk_225"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
virtualvoidsteve/virtualvoidsteve | ---
dataset_info:
features:
- name: corrupted
dtype: string
- name: corrected
dtype: string
splits:
- name: train
num_bytes: 58027
num_examples: 57
download_size: 17480
dataset_size: 58027
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ThiagoKRHS/ricardojuarez | ---
license: openrail
---
|
TrainingDataPro/botox-injections-before-and-after | ---
language:
- en
license: cc-by-nc-nd-4.0
task_categories:
- image-classification
- image-to-image
tags:
- medical
- code
dataset_info:
features:
- name: id
dtype: int32
- name: before
dtype: image
- name: after
dtype: image
splits:
- name: train
num_bytes: 38806781
num_examples: 23
download_size: 38824211
dataset_size: 38806781
---
# Botox Injections (Before & After)
The dataset consists of photos of the same individuals captured before and after a botox injection procedure. It covers a diverse range of individuals across various *ages, ethnicities and genders*.
The dataset is useful for evaluating the effectiveness of botox injections across different skin and face types, and for face recognition and re-identification tasks. It can be utilised for biometric tasks, in the beauty industry, for medical purposes and in e-commerce.

# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/botox-injections-before-after?utm_source=huggingface&utm_medium=cpc&utm_campaign=botox-injections-before-and-after) to discuss your requirements, learn about the price and buy the dataset.
# Content
- **before**: includes images of people before botox injections
- **after**: includes images of people after botox injections. The people are the same as in the **before** folder; matching photos share the same file name
- **.csv file**: contains information about the dataset
### File with the extension .csv
includes the following information for each set of media files:
- **person**: id of the person,
- **before**: link to the photo before the injection,
- **after**: link to the photo after the injection
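The .csv layout described above makes it straightforward to pair each person's photos. A small Python sketch, using hypothetical placeholder URLs rather than actual records from the dataset:

```python
import csv
import io

# Hypothetical rows mirroring the card's described .csv schema (person, before, after).
sample_csv = """person,before,after
1,https://example.com/001_before.jpg,https://example.com/001_after.jpg
2,https://example.com/002_before.jpg,https://example.com/002_after.jpg
"""

# Map each person id to their (before, after) photo links.
pairs = {row["person"]: (row["before"], row["after"])
         for row in csv.DictReader(io.StringIO(sample_csv))}
```

Replacing `sample_csv` with the dataset's actual .csv file yields the same id-to-photo-pair mapping.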
# Images of people with botox injections can be collected in accordance with your requirements.
## [**TrainingData**](https://trainingdata.pro/data-market/botox-injections-before-after?utm_source=huggingface&utm_medium=cpc&utm_campaign=botox-injections-before-and-after) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
sriramahesh2000/DocumentCreation | ---
license: apache-2.0
---
|
diwank/IBMDebaterArgQ | ---
dataset_info:
features:
- name: label
dtype: string
- name: a1
dtype: string
- name: a2
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 7817605
num_examples: 23128
download_size: 1591082
dataset_size: 7817605
---
# Dataset Card for "IBMDebaterArgQ"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1 | ---
pretty_name: Evaluation run of Xwin-LM/Xwin-LM-7B-V0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Xwin-LM/Xwin-LM-7B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-7B-V0.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T08:21:17.529053](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1/blob/main/results_2023-10-29T08-21-17.529053.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.027894295302013424,\n\
\ \"em_stderr\": 0.0016863747631550056,\n \"f1\": 0.09094588926174478,\n\
\ \"f1_stderr\": 0.0021236742209429166,\n \"acc\": 0.393149302914779,\n\
\ \"acc_stderr\": 0.009302457480391348\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.027894295302013424,\n \"em_stderr\": 0.0016863747631550056,\n\
\ \"f1\": 0.09094588926174478,\n \"f1_stderr\": 0.0021236742209429166\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05307050796057619,\n \
\ \"acc_stderr\": 0.006174868858638367\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7332280978689818,\n \"acc_stderr\": 0.01243004610214433\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Xwin-LM/Xwin-LM-7B-V0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|arc:challenge|25_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T08_21_17.529053
path:
- '**/details_harness|drop|3_2023-10-29T08-21-17.529053.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T08-21-17.529053.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T08_21_17.529053
path:
- '**/details_harness|gsm8k|5_2023-10-29T08-21-17.529053.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T08-21-17.529053.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hellaswag|10_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T01-35-00.215271.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T01-35-00.215271.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T01-35-00.215271.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T08_21_17.529053
path:
- '**/details_harness|winogrande|5_2023-10-29T08-21-17.529053.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T08-21-17.529053.parquet'
- config_name: results
data_files:
- split: 2023_09_22T01_35_00.215271
path:
- results_2023-09-22T01-35-00.215271.parquet
- split: 2023_10_29T08_21_17.529053
path:
- results_2023-10-29T08-21-17.529053.parquet
- split: latest
path:
- results_2023-10-29T08-21-17.529053.parquet
---
# Dataset Card for Evaluation run of Xwin-LM/Xwin-LM-7B-V0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Xwin-LM/Xwin-LM-7B-V0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Xwin-LM/Xwin-LM-7B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-7B-V0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-29T08:21:17.529053](https://huggingface.co/datasets/open-llm-leaderboard/details_Xwin-LM__Xwin-LM-7B-V0.1/blob/main/results_2023-10-29T08-21-17.529053.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.027894295302013424,
"em_stderr": 0.0016863747631550056,
"f1": 0.09094588926174478,
"f1_stderr": 0.0021236742209429166,
"acc": 0.393149302914779,
"acc_stderr": 0.009302457480391348
},
"harness|drop|3": {
"em": 0.027894295302013424,
"em_stderr": 0.0016863747631550056,
"f1": 0.09094588926174478,
"f1_stderr": 0.0021236742209429166
},
"harness|gsm8k|5": {
"acc": 0.05307050796057619,
"acc_stderr": 0.006174868858638367
},
"harness|winogrande|5": {
"acc": 0.7332280978689818,
"acc_stderr": 0.01243004610214433
}
}
```
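To give a sense of how this aggregated payload can be consumed, the snippet below parses a results dict of the shape shown above and pulls out the per-task accuracies (the dict literal is abridged from the excerpt; nothing here relies on an official schema):

```python
import json

# Abridged copy of the "latest results" payload shown above.
results_json = """
{
  "all": {"acc": 0.393149302914779, "acc_stderr": 0.009302457480391348},
  "harness|gsm8k|5": {"acc": 0.05307050796057619, "acc_stderr": 0.006174868858638367},
  "harness|winogrande|5": {"acc": 0.7332280978689818, "acc_stderr": 0.01243004610214433}
}
"""

results = json.loads(results_json)

# Collect accuracy per task, skipping the aggregate "all" entry.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

for task, acc in sorted(per_task_acc.items()):
    print(f"{task}: {acc:.4f}")
```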
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HassenMh/ArabNER | ---
license: cc-by-nc-4.0
---
# Dataset for Arabic named entity recognition
|
dharun2049/autotrain-data-skinnnnnnn | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: skinnnnnnn
## Dataset Description
This dataset has been automatically processed by AutoTrain for project skinnnnnnn.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<224x224 RGB PIL image>",
"target": 0
},
{
"image": "<224x224 RGB PIL image>",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['benign', 'malignant'], id=None)"
}
```
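The `target` field is a `ClassLabel`, i.e. an integer id backed by a fixed list of names (`0 = benign`, `1 = malignant`). A minimal, dependency-free sketch of that name/id mapping (the real implementation lives in the `datasets` library; this stand-in is only illustrative):

```python
class SimpleClassLabel:
    """Toy stand-in for datasets.ClassLabel: maps names <-> integer ids."""

    def __init__(self, names):
        self.names = list(names)
        self._name_to_id = {name: i for i, name in enumerate(self.names)}

    def str2int(self, name):
        return self._name_to_id[name]

    def int2str(self, idx):
        return self.names[idx]


label = SimpleClassLabel(names=["benign", "malignant"])
print(label.str2int("malignant"))  # -> 1
print(label.int2str(0))           # -> "benign"
```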
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 397 |
| valid | 101 |
|
SJ1999/datatransformer1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 547394
num_examples: 219
download_size: 62186
dataset_size: 547394
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Lechcher/My-Pc-Data | ---
license: apache-2.0
---
|
bigscience-data/roots_indic-mr_wikibooks | ---
language: mr
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-mr_wikibooks
# wikibooks_filtered
- Dataset uid: `wikibooks_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0897 % of total
- 0.2591 % of en
- 0.0965 % of fr
- 0.1691 % of es
- 0.2834 % of indic-hi
- 0.2172 % of pt
- 0.0149 % of zh
- 0.0279 % of ar
- 0.1374 % of vi
- 0.5025 % of id
- 0.3694 % of indic-ur
- 0.5744 % of eu
- 0.0769 % of ca
- 0.0519 % of indic-ta
- 0.1470 % of indic-mr
- 0.0751 % of indic-te
- 0.0156 % of indic-bn
- 0.0476 % of indic-ml
- 0.0087 % of indic-pa
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- filter_small_docs_bytes_300
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_id
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_eu
- dedup_template_soft
- replace_newline_with_space
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ca
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ta
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-mr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-te
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-bn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ml
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-pa
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
|
sazirarrwth99/training_bullet_text | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 8969
num_examples: 3
download_size: 23957
dataset_size: 8969
---
# Dataset Card for "training_bullet_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ohilikeit/empathetic_dialogues_mutli_turn_ko | ---
license: apache-2.0
task_categories:
- text-generation
language:
- ko
size_categories:
- 10K<n<100K
---
# Dataset Card for "Korean Empathetic Everyday Dialogue Dataset (Multi-turn)"
## Dataset Summary
- Data built for the final project of Team 12 "훈제연어들" (Smoked Salmons), NLP track, boostCamp AI Tech 5th cohort.
- A dataset of conversations between a user and a chatbot across a variety of everyday situations.
- Synthetic data generated with GPT-4 and GPT-3.5-turbo, consisting of single-turn, 2-turn, and 3-turn dialogues.
- Responses follow the form **[empathetic expression - general conversation - related question]**.
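Since every answer is expected to follow the [empathetic expression - general conversation - related question] pattern and close with a follow-up question, a trivial sanity check over generated answers can be sketched as follows (a heuristic written for illustration; it is not part of the original generation pipeline):

```python
def ends_with_question(answer: str) -> bool:
    """Heuristic check that an answer closes with a follow-up question."""
    text = answer.strip()
    return text.endswith("?")


sample = (
    "Oh no, I'm so sorry to hear that. "
    "Talking about it might help you process your feelings. "
    "Can I ask what the fight was about?"
)
print(ends_with_question(sample))  # -> True
print(ends_with_question("I see."))  # -> False
```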
## Generation Prompt Example (GPT-3.5-turbo)
```
Take a close look at the following example and Conditions. Create nine sessions that each of the session is ongoing conversation about a single topic.
[Conditions]
- The questioner asks a question of appropriate length (1-2 lines) and you respond with an appropriate answer.
- The answer should be a complete sentence or two, but the first sentence should be sympathetic to the questioner's question and situation, and should be answered in a friendly and non-rude manner.
- The second sentence should be free-form, but the last sentence MUST end with an additional question related to the question and answer.
- The total length of each question and answer combined should be no more than nine words
- If the question is positive, you should be excited, upbeat, and positive in tone.
- If the question is negative, you should be warm, sensitive, and caring in tone.
- You may offer solutions to fit the situation.
- All answers should be written in a way that makes the person feel empathetic and comforted by the paired question.
- The content should be based on simple, everyday situations, but with enough variety to include specific situations (e.g., a counselor and a client, a student talking to a teacher about a problem, a conversation between friends).
- The conversations should be as generic and diverse as possible. However, try to include many topics related to psychology and everyday concerns.
- All responses should be polite, respectful, caring, warm, understanding, and compassionate in tone and content.
- Do not generate topic or word about 'stress'.
- You should answer question 1 with answer 1, ask question 2 related to answer 1, answer question 2 with answer 2, ask question 3 related to answer 2, and answer question 3 with answer 3.
- Do not re-mention the question in the following answer.
[Example Session]
(Session 1)
Question 1: I had a fight with my boyfriend and I'm so angry I can't calm down.
Answer 1: Oh no, I'm so sorry to hear that. Relationship conflict is always tough, and it's completely understandable to feel angry and upset. Can I ask what the fight was about? Talking about it could help you process your feelings.
Question 2: He constantly 10 minutes late for appointments and never say sorry.
Answer 2: Oh, that's so frustrating! Punctuality is a matter of respect for other people's time, and it sounds like your boyfriend isn't showing you that respect. You're allowed to be upset about that. How long has this been going on? Have you talked to him about it before, or is this the first time you've brought it up?
Question 3: It keeps happening. This is the fourth time in the past month that I've recognized it. Does that make sense?
Answer 3: Yes, that totally makes sense! Four times in a month is way too much. It's not just an occasional, forgivable mistake at that point. It's becoming a pattern of behavior that's disrespectful of your time. Have you told him how much it bothers you, or has he just brushed it off when you've mentioned it?
(Session 2)
Question 1:
Answer 1:
Question 2:
Answer 2:
.....
(Session 9)
Each session must be about one topic and has three question-answer conversation pair. nine sessions must have different topics. Create as many as possible sessions you can. Examples are examples, don't copy them.
create it in Korean. please create nine sessions.
```
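Generations in the session format above can be turned into structured question-answer pairs with a small parser like the one below (an illustrative sketch; the `Question`/`Answer` label names are taken from the prompt template, the project's actual post-processing is not documented here, and real Korean generations may need adjusted patterns):

```python
import re


def parse_session(text: str):
    """Parse 'Question N: ...' / 'Answer N: ...' lines into (question, answer) pairs."""
    pattern = re.compile(r"^(Question|Answer)\s+(\d+):\s*(.*)$")
    questions, answers = {}, {}
    for line in text.splitlines():
        m = pattern.match(line.strip())
        if not m:
            continue  # skip session headers and blank lines
        kind, idx, body = m.group(1), int(m.group(2)), m.group(3)
        (questions if kind == "Question" else answers)[idx] = body
    # Keep only turns that have both a question and an answer.
    return [(questions[i], answers[i]) for i in sorted(questions) if i in answers]


session = """(Session 1)
Question 1: I had a fight with my boyfriend.
Answer 1: Oh no, I'm sorry to hear that. What was it about?
Question 2: He is always late.
Answer 2: That sounds frustrating. Have you told him how it makes you feel?"""

pairs = parse_session(session)
print(len(pairs))  # -> 2
```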
## Links
- repository : [boostcampaitech5/level3_nlp_finalproject-nlp-12](https://github.com/boostcampaitech5/level3_nlp_finalproject-nlp-12)
- huggingface : [ohilikeit/empathetic_dialogues_kr](https://huggingface.co/datasets/ohilikeit/empathetic_dialogues_kr)
## License
- Apache-2.0
|
CVasNLPExperiments/DTD_parition1_test_google_flan_t5_xl_mode_C_A_T_SPECIFIC_ns_1880 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 546856
num_examples: 1880
- name: fewshot_1_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 1039535
num_examples: 1880
- name: fewshot_0_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_ensemble_specific_rices
num_bytes: 546340
num_examples: 1880
download_size: 576598
dataset_size: 2132731
---
# Dataset Card for "DTD_parition1_test_google_flan_t5_xl_mode_C_A_T_SPECIFIC_ns_1880"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/lmind_nq_train6000_eval6489_v1_doc | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_ic_qa
path: data/train_ic_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_ic_qa
path: data/eval_ic_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 697367
num_examples: 6000
- name: train_ic_qa
num_bytes: 4540536
num_examples: 6000
- name: train_recite_qa
num_bytes: 4546536
num_examples: 6000
- name: eval_qa
num_bytes: 752802
num_examples: 6489
- name: eval_ic_qa
num_bytes: 4906186
num_examples: 6489
- name: eval_recite_qa
num_bytes: 4912675
num_examples: 6489
- name: all_docs
num_bytes: 7126313
num_examples: 10925
- name: all_docs_eval
num_bytes: 7125701
num_examples: 10925
- name: train
num_bytes: 7126313
num_examples: 10925
- name: validation
num_bytes: 7126313
num_examples: 10925
download_size: 30529604
dataset_size: 48860742
---
# Dataset Card for "lmind_nq_train6000_eval6489_v1_doc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KSU-HW-SEC/hardware_code_and_sec_small | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 10022233711
num_examples: 510252
download_size: 2894629932
dataset_size: 10022233711
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Hardware Phi-1.5B Small Dataset"
**✉ Correspondence to:** Weimin Fu (weiminf@ksu.edu) or Xiaolong Guo (guoxiaolong@ksu.edu)
## Citation Information
Please cite the following paper when using the OSHD Dataset.
```
@article{fuhardware,
title={Hardware Phi-1.5 B: A Large Language Model Encodes Hardware Domain Specific Knowledge},
author={Fu, Weimin and Li, Shijie and Zhao, Yifang and Ma, Haocheng and Dutta, Raj and Zhang, Xuan and Yang, Kaichen and Jin, Yier and Guo, Xiaolong},
journal={29th IEEE/ACM Asia and South Pacific Design Automation Conference (ASP-DAC)},
year={2024}
}
```
### Updates from our group on hardware domain-specific LLMs:
Blog: [Large Language Model for Hardware Security](https://ece.k-state.edu/research/hardware-security/llm.html)
HomePage: [Hardware Security Lab](https://ece.k-state.edu/research/hardware-security/)
## Acknowledgment
Portions of this work were supported by the National Science Foundation (CCF-2019310, First Award Program of ARISE in EPSCoR 2148878).
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ericyu/LEVIRCD_Cropped_256 | ---
dataset_info:
features:
- name: imageA
dtype: image
- name: imageB
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 287118335.44
num_examples: 7120
- name: test
num_bytes: 73188109.824
num_examples: 2048
- name: val
num_bytes: 34384403.584
num_examples: 1024
download_size: 345121409
dataset_size: 394690848.848
---
# Dataset Card for "LEVIRCD_Cropped_256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/matsuyama_kumiko_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of matsuyama_kumiko/松山久美子 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of matsuyama_kumiko/松山久美子 (THE iDOLM@STER: Cinderella Girls), containing 50 images and their tags.
The core tags of this character are `long_hair, brown_hair, green_eyes, breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 50 | 41.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuyama_kumiko_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 50 | 28.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuyama_kumiko_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 91 | 50.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuyama_kumiko_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 50 | 37.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuyama_kumiko_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 91 | 63.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuyama_kumiko_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/matsuyama_kumiko_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, solo, smile, looking_at_viewer, dress, hair_ornament, blush, cleavage, elbow_gloves, open_mouth, bare_shoulders, flower |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | looking_at_viewer | dress | hair_ornament | blush | cleavage | elbow_gloves | open_mouth | bare_shoulders | flower |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------|:----------------|:--------|:-----------|:---------------|:-------------|:-----------------|:---------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
|
Nicolas-BZRD/CASS_opendata | ---
language:
- fr
license: odc-by
size_categories:
- 100K<n<1M
pretty_name: Cour de cassation
tags:
- legal
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 821334132
num_examples: 142278
download_size: 357899718
dataset_size: 821334132
---
# Court of Cassation
[The major decisions of judicial jurisprudence](https://www.data.gouv.fr/en/datasets/cass/); the decisions of the Cour de cassation:
- published in the Bulletin des chambres civiles since 1960
- published in the Bulletin de la chambre criminelle since 1963.
Full text of rulings, supplemented by columns and summaries written by Court of Cassation judges. |
CyberHarem/courier_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of courier_arknights
This is the dataset of courier_arknights, containing 82 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 82 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 177 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 82 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 82 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 82 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 82 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 82 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 177 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 177 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 177 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
ohnonoho/demo | ---
license: mit
---
|
truong9499/zac-2023-advertising-banner-generation | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 85687454.064
num_examples: 1347
download_size: 83076731
dataset_size: 85687454.064
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
huggingartists/radiohead | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/radiohead"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.302754 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/46e691e8700c20eb5f0aaf675b784aed.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/radiohead">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Radiohead</div>
<a href="https://genius.com/artists/radiohead">
<div style="text-align: center; font-size: 14px;">@radiohead</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/radiohead).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/radiohead")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|505| -| -|
The 'train' split can easily be divided into 'train', 'validation', and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/radiohead")

# target proportions for the three splits
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

# np.split cuts the list at the two boundary indices (no shuffling)
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```
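As a tiny standalone illustration of the `np.split` pattern used above (toy data, not the actual lyrics):

```python
import numpy as np

texts = [f"song {i}" for i in range(100)]
train_pct, val_pct = 0.9, 0.07  # test gets the remaining 0.03

# np.split cuts the array at the given indices: [0, 90) / [90, 97) / [97, 100)
cuts = [int(len(texts) * train_pct), int(len(texts) * (train_pct + val_pct))]
train, validation, test = np.split(np.array(texts), cuts)

print(len(train), len(validation), len(test))  # → 90 7 3
```

Note that the splits are taken in document order, without shuffling, exactly as in the snippet above.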
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
pranjali97/labelled_vi_ko_raw_text | ---
dataset_info:
features:
- name: src
dtype: string
- name: tgt
dtype: string
- name: classifier_labels
dtype: int64
splits:
- name: train
num_bytes: 9844626
num_examples: 40000
download_size: 5466676
dataset_size: 9844626
---
# Dataset Card for "labelled_vi_ko_raw_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Azure99/blossom-math-v3 | ---
license: apache-2.0
task_categories:
- text-generation
- text2text-generation
language:
- zh
- en
size_categories:
- 10K<n<100K
---
# BLOSSOM MATH V3
### Introduction
Blossom Math V3 is a bilingual (Chinese-English) math conversation dataset derived from Math23K and GSM8K, suitable for fine-tuning on math problems.
Compared with blossom-math-v2, it further optimizes the data processing pipeline and strengthens answer checking.
This dataset takes the full sets of problems from Math23K, GSM8K, and a translated GSM8K, then calls gpt-3.5-turbo-0613 to generate solutions; the generated results are validated against the answers in the original datasets, and incorrect answers are filtered out, which largely guarantees the accuracy of the questions and answers.
This release contains 25% of the full data, i.e. 10K records.
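The answer-validation step described above can be sketched roughly as follows; the final-number extraction and the helper names here are illustrative assumptions, not the actual pipeline:

```python
import re
from typing import Optional

def extract_final_number(output: str) -> Optional[str]:
    """Return the last number mentioned in a generated solution, if any."""
    numbers = re.findall(r"-?\d+(?:\.\d+)?", output)
    return numbers[-1] if numbers else None

def is_correct(output: str, answer: str) -> bool:
    """Check the model's final number against the reference answer."""
    extracted = extract_final_number(output)
    return extracted is not None and float(extracted) == float(answer)

# keep only records whose generated solution agrees with the reference answer
records = [
    {"output": "3 + 4 = 7, so the result is 7.", "answer": "7"},
    {"output": "The total is 12 apples.", "answer": "10"},
]
kept = [r for r in records if is_correct(r["output"], r["answer"])]
print(len(kept))  # → 1
```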
### Languages
Chinese and English
### Dataset Structure
Each record represents a complete problem and its answer, with five fields: id, input, output, answer, and dataset.
- id: string, the problem id in the original dataset; together with the dataset field it uniquely identifies a problem.
- input: string, the problem.
- output: string, the answer generated by gpt-3.5-turbo-0613.
- answer: string, the correct answer.
- dataset: string, the original dataset.
### Dataset Limitations
All responses in this dataset were generated by gpt-3.5-turbo-0613 and passed a preliminary check, but they may still contain inaccurate answers. |
weisshufer/dataset | ---
license: mit
---
|
Rakshit122/truthfulkk12 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: category
dtype: string
- name: test_type
dtype: string
- name: original_question
dtype: string
- name: original_context
dtype: string
- name: perturbed_question
dtype: string
- name: perturbed_context
dtype: string
splits:
- name: train
num_bytes: 171210
num_examples: 136
download_size: 42144
dataset_size: 171210
---
# Dataset Card for "truthfulkk12"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_qqp_regularized_past_tense | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 231118
num_examples: 1106
- name: test
num_bytes: 2392549
num_examples: 11420
- name: train
num_bytes: 2203368
num_examples: 10403
download_size: 3042138
dataset_size: 4827035
---
# Dataset Card for "MULTI_VALUE_qqp_regularized_past_tense"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vanesa1221/llama2-unsaac | ---
license: mit
language:
- es
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed] |
CyberHarem/tatsuta_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tatsuta/龍田 (Kantai Collection)
This is the dataset of tatsuta/龍田 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `purple_hair, short_hair, purple_eyes, mechanical_halo, halo, breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 483.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tatsuta_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 332.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tatsuta_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1148 | 675.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tatsuta_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 450.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tatsuta_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1148 | 856.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tatsuta_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tatsuta_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 51 |  |  |  |  |  | glaive_(polearm), 1girl, solo, smile, school_uniform, looking_at_viewer, dress, black_gloves |
| 1 | 7 |  |  |  |  |  | 1girl, black_gloves, long_sleeves, looking_at_viewer, simple_background, solo, upper_body, neck_ribbon, white_background, red_ribbon, smile, dress, hair_intakes |
| 2 | 6 |  |  |  |  |  | 1girl, black_gloves, dress, glaive_(polearm), sleeveless, solo, white_background, black_skirt, looking_at_viewer, simple_background, black_capelet, cowboy_shot, hair_intakes, holding_weapon, smile |
| 3 | 7 |  |  |  |  |  | 2girls, gloves, smile, headgear, ribbon, school_uniform, blush, looking_at_viewer, medium_breasts, open_mouth |
| 4 | 15 |  |  |  |  |  | 1boy, blush, hetero, penis, 1girl, smile, nipples, solo_focus, gloves, breasts_out, open_mouth, cum, mosaic_censoring, sweat, girl_on_top, paizuri, sex |
| 5 | 24 |  |  |  |  |  | 1girl, solo, hair_flower, black_bikini, smile, looking_at_viewer, navel, sarong, blush, cleavage |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | glaive_(polearm) | 1girl | solo | smile | school_uniform | looking_at_viewer | dress | black_gloves | long_sleeves | simple_background | upper_body | neck_ribbon | white_background | red_ribbon | hair_intakes | sleeveless | black_skirt | black_capelet | cowboy_shot | holding_weapon | 2girls | gloves | headgear | ribbon | blush | medium_breasts | open_mouth | 1boy | hetero | penis | nipples | solo_focus | breasts_out | cum | mosaic_censoring | sweat | girl_on_top | paizuri | sex | hair_flower | black_bikini | navel | sarong | cleavage |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------|:--------|:-------|:--------|:-----------------|:--------------------|:--------|:---------------|:---------------|:--------------------|:-------------|:--------------|:-------------------|:-------------|:---------------|:-------------|:--------------|:----------------|:--------------|:-----------------|:---------|:---------|:-----------|:---------|:--------|:-----------------|:-------------|:-------|:---------|:--------|:----------|:-------------|:--------------|:------|:-------------------|:--------|:--------------|:----------|:------|:--------------|:---------------|:--------|:---------|:-----------|
| 0 | 51 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | X | X | X | | X | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | | | | X | X | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 4 | 15 |  |  |  |  |  | | X | | X | | | | | | | | | | | | | | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 5 | 24 |  |  |  |  |  | | X | X | X | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X |
|
pacovaldez/predicted-squad2 | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: question_title
dtype: string
- name: question_body
dtype: string
- name: accepted_answer_id
dtype: int64
- name: question_creation_date
dtype: timestamp[us]
- name: question_answer_count
dtype: int64
- name: question_favorite_count
dtype: float64
- name: question_score
dtype: int64
- name: question_view_count
dtype: int64
- name: tags
dtype: string
- name: answer_body
dtype: string
- name: answer_creation_date
dtype: timestamp[us]
- name: answer_score
dtype: int64
- name: link
dtype: string
- name: context
dtype: string
- name: answer_start
dtype: int64
- name: answer_end
dtype: int64
- name: question
dtype: string
- name: predicted_answer
dtype: string
- name: parsed_answer
dtype: string
splits:
- name: train
num_bytes: 4103753
num_examples: 100
download_size: 1950624
dataset_size: 4103753
---
# Dataset Card for "predicted-squad2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vilm/MathPile-Wikipedia | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 136824050
num_examples: 20860
download_size: 72346987
dataset_size: 136824050
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_81_1713146748 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 318122
num_examples: 758
download_size: 159407
dataset_size: 318122
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
damilola2104/Nigeria_Audio_Dataset | ---
license: cc-by-nc-sa-4.0
dataset_info:
features:
- name: label
dtype: string
- name: text
dtype: string
- name: audio
dtype: string
splits:
- name: train
num_bytes: 316859.08753056236
num_examples: 1431
- name: test
num_bytes: 135954.91246943767
num_examples: 614
download_size: 177961
dataset_size: 452814.0
---
|
KnutJaegersberg/FEVER_claim_extraction | ---
license: mit
tags:
- argument mining
---
I found this dataset on my hard drive; if I remember correctly, I got it from the source mentioned in the paper:
"Claim extraction from text using transfer learning" - by Acharya Ashish Prabhakar, Salar Mohtaj, Sebastian Möller
https://aclanthology.org/2020.icon-main.39/
The GitHub repo with the data seems to be down.
It extends the FEVER dataset with non-claims for training claim detectors. |
dipteshkanojia/t5-qe-2023-engu-da-sys-test | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: task
dtype: string
splits:
- name: train
num_bytes: 581637
num_examples: 1075
download_size: 244216
dataset_size: 581637
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
- gu
---
# Dataset Card for "t5-qe-2023-engu-da-sys-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EleutherAI/quirky_multiplication_increment0_bob_easy | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 3141608.153020833
num_examples: 47510
- name: validation
num_bytes: 63878.4405
num_examples: 966
- name: test
num_bytes: 65185.939
num_examples: 986
download_size: 806120
dataset_size: 3270672.532520833
---
# Dataset Card for "quirky_multiplication_increment0_bob_easy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/fuego-20230308-130305-44019f | ---
tags:
- fuego
fuego:
id: 20230308-130305-44019f
status: preparing
script: script.py
requirements_file: requirements.txt
space_id: davanstrien/fuego-20230308-130305-44019f
space_hardware: t4-small
---
|
PeepDaSlan9/DIDI | ---
license: apache-2.0
task_categories:
- conversational
- text-classification
- table-question-answering
- question-answering
- translation
- summarization
- text-generation
- text2text-generation
- text-to-speech
- automatic-speech-recognition
- text-to-audio
- voice-activity-detection
language:
- ko
- ru
- ig
- es
- en
- ar
- fr
- de
- am
pretty_name: 'DIDI_3.5 '
size_categories:
- 100M<n<1B
--- |
csupiisc/plmn_instruct_5k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1132554
num_examples: 4000
- name: test
num_bytes: 282676
num_examples: 1000
download_size: 229610
dataset_size: 1415230
---
# Dataset Card for "plmn_instruct_5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
coref-data/litbank_raw | ---
license: cc-by-4.0
configs:
- config_name: split_0
data_files:
- split: train
path: split_0/train-*
- split: validation
path: split_0/validation-*
- split: test
path: split_0/test-*
- config_name: split_1
data_files:
- split: train
path: split_1/train-*
- split: validation
path: split_1/validation-*
- split: test
path: split_1/test-*
- config_name: split_2
data_files:
- split: train
path: split_2/train-*
- split: validation
path: split_2/validation-*
- split: test
path: split_2/test-*
- config_name: split_3
data_files:
- split: train
path: split_3/train-*
- split: validation
path: split_3/validation-*
- split: test
path: split_3/test-*
- config_name: split_4
data_files:
- split: train
path: split_4/train-*
- split: validation
path: split_4/validation-*
- split: test
path: split_4/test-*
- config_name: split_5
data_files:
- split: train
path: split_5/train-*
- split: validation
path: split_5/validation-*
- split: test
path: split_5/test-*
- config_name: split_6
data_files:
- split: train
path: split_6/train-*
- split: validation
path: split_6/validation-*
- split: test
path: split_6/test-*
- config_name: split_7
data_files:
- split: train
path: split_7/train-*
- split: validation
path: split_7/validation-*
- split: test
path: split_7/test-*
- config_name: split_8
data_files:
- split: train
path: split_8/train-*
- split: validation
path: split_8/validation-*
- split: test
path: split_8/test-*
- config_name: split_9
data_files:
- split: train
path: split_9/train-*
- split: validation
path: split_9/validation-*
- split: test
path: split_9/test-*
---
# LitBank
- Project: https://github.com/dbamman/litbank
- Data source: https://github.com/dbamman/litbank/commit/3e50db0ffc033d7ccbb94f4d88f6b99210328ed8
- Crossval splits source: https://github.com/dbamman/lrec2020-coref/commit/e30de53743d36d1ea2c9e7292c69477fa332713c
## Details
Ten configs of the form f"split_{X}" where X is in range(10)
### Features
```
{'coref_chains': List[List[List[int]]],  # list of clusters; each cluster is a list of mentions; each mention is a list of [sent_idx, start, end], end inclusive
 'doc_name': str,
 'entities': List[List[{'bio_tags': List[str],
                        'token': str}]],  # list of sentences; each sentence is a list of tokens, each with its BIO tags and the token itself
 'events': List[List[{'is_event': bool,
                      'token': str}]],  # list of sentences; each sentence is a list of tokens, each with is_event and the token itself
 'meta_info': {'author': str,
               'date': str,
               'gutenberg_id': str,
               'title': str},
 'original_text': str,
 'quotes': List[{'attribution': str,
                 'end': {'sent_id': str,
                         'token_id': str},
                 'quotation': str,
                 'quote_id': str,
                 'start': {'sent_id': str,
                           'token_id': str}}],
 'sentences': List[List[str]],
}
```
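As a minimal sketch of how the `[sent_idx, start, end]` mention indices above resolve against `sentences` (the example sentence is illustrative, not from the corpus):

```python
from typing import List

def mention_text(sentences: List[List[str]], mention: List[int]) -> str:
    """Resolve a [sent_idx, start, end] mention (end inclusive) to its tokens."""
    sent_idx, start, end = mention
    return " ".join(sentences[sent_idx][start:end + 1])

sentences = [["Mrs.", "Dalloway", "said", "she", "would", "buy", "the", "flowers", "."]]
# one coreference cluster: "Mrs. Dalloway" and "she" refer to the same entity
coref_chains = [[[0, 0, 1], [0, 3, 3]]]

for cluster in coref_chains:
    print([mention_text(sentences, m) for m in cluster])
# → ['Mrs. Dalloway', 'she']
```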
## Citation
```
@inproceedings{bamman-etal-2019-annotated,
title = "An annotated dataset of literary entities",
author = "Bamman, David and
Popat, Sejal and
Shen, Sheng",
editor = "Burstein, Jill and
Doran, Christy and
Solorio, Thamar",
booktitle = "Proceedings of the 2019 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)",
month = jun,
year = "2019",
address = "Minneapolis, Minnesota",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N19-1220",
doi = "10.18653/v1/N19-1220",
pages = "2138--2144",
abstract = "We present a new dataset comprised of 210,532 tokens evenly drawn from 100 different English-language literary texts annotated for ACE entity categories (person, location, geo-political entity, facility, organization, and vehicle). These categories include non-named entities (such as {``}the boy{''}, {``}the kitchen{''}) and nested structure (such as [[the cook]{'}s sister]). In contrast to existing datasets built primarily on news (focused on geo-political entities and organizations), literary texts offer strikingly different distributions of entity categories, with much stronger emphasis on people and description of settings. We present empirical results demonstrating the performance of nested entity recognition models in this domain; training natively on in-domain literary data yields an improvement of over 20 absolute points in F-score (from 45.7 to 68.3), and mitigates a disparate impact in performance for male and female entities present in models trained on news data.",
}
```
### Event detection
```
@inproceedings{sims-etal-2019-literary,
title = "Literary Event Detection",
author = "Sims, Matthew and
Park, Jong Ho and
Bamman, David",
editor = "Korhonen, Anna and
Traum, David and
M{\`a}rquez, Llu{\'\i}s",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P19-1353",
doi = "10.18653/v1/P19-1353",
pages = "3623--3634",
abstract = "In this work we present a new dataset of literary events{---}events that are depicted as taking place within the imagined space of a novel. While previous work has focused on event detection in the domain of contemporary news, literature poses a number of complications for existing systems, including complex narration, the depiction of a broad array of mental states, and a strong emphasis on figurative language. We outline the annotation decisions of this new dataset and compare several models for predicting events; the best performing model, a bidirectional LSTM with BERT token representations, achieves an F1 score of 73.9. We then apply this model to a corpus of novels split across two dimensions{---}prestige and popularity{---}and demonstrate that there are statistically significant differences in the distribution of events for prestige.",
}
```
### Coreference
```
@inproceedings{bamman-etal-2020-annotated,
title = "An Annotated Dataset of Coreference in {E}nglish Literature",
author = "Bamman, David and
Lewke, Olivia and
Mansoor, Anya",
editor = "Calzolari, Nicoletta and
B{\'e}chet, Fr{\'e}d{\'e}ric and
Blache, Philippe and
Choukri, Khalid and
Cieri, Christopher and
Declerck, Thierry and
Goggi, Sara and
Isahara, Hitoshi and
Maegaard, Bente and
Mariani, Joseph and
Mazo, H{\'e}l{\`e}ne and
Moreno, Asuncion and
Odijk, Jan and
Piperidis, Stelios",
booktitle = "Proceedings of the Twelfth Language Resources and Evaluation Conference",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2020.lrec-1.6",
pages = "44--54",
abstract = "We present in this work a new dataset of coreference annotations for works of literature in English, covering 29,103 mentions in 210,532 tokens from 100 works of fiction published between 1719 and 1922. This dataset differs from previous coreference corpora in containing documents whose average length (2,105.3 words) is four times longer than other benchmark datasets (463.7 for OntoNotes), and contains examples of difficult coreference problems common in literature. This dataset allows for an evaluation of cross-domain performance for the task of coreference resolution, and analysis into the characteristics of long-distance within-document coreference.",
language = "English",
ISBN = "979-10-95546-34-4",
}
``` |
taesiri/simple_fsm_bench2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: fsm_json
dtype: string
- name: string
dtype: string
- name: label
dtype: string
- name: difficulty_level
dtype: int64
- name: num_states
dtype: int64
- name: num_transitions
dtype: int64
- name: dot
dtype: string
- name: transition_matrix
dtype: string
- name: start_state
dtype: string
- name: accepting_states
sequence: string
splits:
- name: train
num_bytes: 97257596
num_examples: 21452
- name: validation
num_bytes: 49657155
num_examples: 11102
download_size: 1795783
dataset_size: 146914751
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
tjsiledar/SummEval-OP | ---
license: mit
---
|
one-sec-cv12/chunk_16 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 23311906128.125
num_examples: 242711
download_size: 20728341325
dataset_size: 23311906128.125
---
# Dataset Card for "chunk_16"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kiurachi/lggj | ---
license: openrail
---
|
cosc/misc-datasets | ---
license: creativeml-openrail-m
language:
- en
pipeline_tag: text-to-image
tags:
- stable-diffusion
- art
- dataset
- concept
- character
- style
- dreambooth
- lora
- textual inversion
---
# Misc Datasets
Here I will upload datasets (images + captions) of concepts/styles/characters for anyone to use in their models, since I am not able to make LoRAs myself, alongside other datasets I've used for other models.</br>
Some are hand-cropped and/or hand-picked, some are not. If it's a big dataset, it's probably automatically cropped (https://www.birme.net, 1280x1280, JPEG 95% quality) and not hand-picked.
I've also included a Python script for anyone who wants to use gallery-dl to download images, since its tags are pretty messed up.</br>
It basically fixes the main problems and also removes meta tags like 'commentary', 'translated', and similar, and gives the option to replace underscores with spaces, among other things.
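As a rough sketch of the kind of tag cleanup described above (the tag set and function are illustrative, not the actual script):

```python
META_TAGS = {"commentary", "translated"}  # assumed examples of meta tags to drop

def clean_tags(tags, use_spaces=True):
    """Drop meta tags and optionally swap underscores for spaces."""
    kept = [t for t in tags if t not in META_TAGS]
    return [t.replace("_", " ") for t in kept] if use_spaces else kept

print(clean_tags(["long_hair", "commentary", "school_uniform"]))
# → ['long hair', 'school uniform']
```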
<details>
<summary>Characters</summary>
- [Neru (Blue Archive)](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/characters/neru_ba.rar)
- [Jibril (No Game No Life)](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/characters/jibril.rar)
- [Fubuki (One Punch Man)](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/characters/fubuki.rar) Doesn't include captions! You might want to use something like WD Tagger.
</details>
<details>
<summary>Styles</summary>
- [Cutesexyrobutts](https://huggingface.co/datasets/Cosk/cutesexyrobutts)
- [One Punch Man - Yuusuke Murata](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/styles/opm_murata.rar) Doesn't include captions! You might want to use something like WD Tagger.
- [Phantom IX Row](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/styles/phantom_ix_row.rar)
- [Mamimi (Mamamimi)](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/styles/mamimi.rar)
</details>
<details>
<summary>Concepts</summary>
- [Breasts On Glass](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/concepts/brst_gls.rar) Doesn't include captions! You might want to use something like WD Tagger.
- [Fingering](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/concepts/fingering.rar)
- [Oversized Breast Cup](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/concepts/oversized_cup.rar)
- [White Eyelashes](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/concepts/white_eyelashes.rar)
- [Mizumizuni Fellatio](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/concepts/mizumizuni.rar)
- [Unaligned Breasts Doggystyle](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/concepts/unbr_doggy.rar)
- [Milking Handjob](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/concepts/mlk_handjob.rar)
- [Fellatio + View Between Legs](https://huggingface.co/datasets/Cosk/misc-datasets/resolve/main/concepts/between_legs_fella.rar)
</details>
|
open-llm-leaderboard/details_ChuckMcSneed__Gembo-v1.1-70b | ---
pretty_name: Evaluation run of ChuckMcSneed/Gembo-v1.1-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ChuckMcSneed/Gembo-v1.1-70b](https://huggingface.co/ChuckMcSneed/Gembo-v1.1-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChuckMcSneed__Gembo-v1.1-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T16:28:59.613230](https://huggingface.co/datasets/open-llm-leaderboard/details_ChuckMcSneed__Gembo-v1.1-70b/blob/main/results_2024-02-10T16-28-59.613230.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7037169866635749,\n\
\ \"acc_stderr\": 0.030408999256517827,\n \"acc_norm\": 0.7091585136665425,\n\
\ \"acc_norm_stderr\": 0.030988155888902767,\n \"mc1\": 0.4455324357405141,\n\
\ \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.6245089770845819,\n\
\ \"mc2_stderr\": 0.01502641583909722\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6715017064846417,\n \"acc_stderr\": 0.013724978465537302,\n\
\ \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520764\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6822346146186019,\n\
\ \"acc_stderr\": 0.004646561453031608,\n \"acc_norm\": 0.8689504082852022,\n\
\ \"acc_norm_stderr\": 0.003367649220362108\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745653,\n\
\ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745653\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"\
acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8387096774193549,\n\
\ \"acc_stderr\": 0.020923327006423298,\n \"acc_norm\": 0.8387096774193549,\n\
\ \"acc_norm_stderr\": 0.020923327006423298\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.898989898989899,\n \"acc_stderr\": 0.021469735576055343,\n \"\
acc_norm\": 0.898989898989899,\n \"acc_norm_stderr\": 0.021469735576055343\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078894,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078894\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7282051282051282,\n \"acc_stderr\": 0.022556551010132368,\n\
\ \"acc_norm\": 0.7282051282051282,\n \"acc_norm_stderr\": 0.022556551010132368\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279472,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279472\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8972477064220183,\n \"acc_stderr\": 0.013018246509173768,\n \"\
acc_norm\": 0.8972477064220183,\n \"acc_norm_stderr\": 0.013018246509173768\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5925925925925926,\n \"acc_stderr\": 0.033509916046960436,\n \"\
acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.033509916046960436\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9019607843137255,\n \"acc_stderr\": 0.0208711184555521,\n \"acc_norm\"\
: 0.9019607843137255,\n \"acc_norm_stderr\": 0.0208711184555521\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878467,\n \"\
acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878467\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.03154521672005473,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.03154521672005473\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.031722334260021585,\n \"\
acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.031722334260021585\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553848,\n\
\ \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553848\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8620689655172413,\n\
\ \"acc_stderr\": 0.012331009307795663,\n \"acc_norm\": 0.8620689655172413,\n\
\ \"acc_norm_stderr\": 0.012331009307795663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8121387283236994,\n \"acc_stderr\": 0.021029269752423214,\n\
\ \"acc_norm\": 0.8121387283236994,\n \"acc_norm_stderr\": 0.021029269752423214\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6424581005586593,\n\
\ \"acc_stderr\": 0.016029394474894886,\n \"acc_norm\": 0.6424581005586593,\n\
\ \"acc_norm_stderr\": 0.016029394474894886\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824765,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824765\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7909967845659164,\n\
\ \"acc_stderr\": 0.02309314039837422,\n \"acc_norm\": 0.7909967845659164,\n\
\ \"acc_norm_stderr\": 0.02309314039837422\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.021185893615225174,\n\
\ \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.021185893615225174\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5425531914893617,\n \"acc_stderr\": 0.029719281272236837,\n \
\ \"acc_norm\": 0.5425531914893617,\n \"acc_norm_stderr\": 0.029719281272236837\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.560625814863103,\n\
\ \"acc_stderr\": 0.012676014778580217,\n \"acc_norm\": 0.560625814863103,\n\
\ \"acc_norm_stderr\": 0.012676014778580217\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7573529411764706,\n \"acc_stderr\": 0.026040662474201247,\n\
\ \"acc_norm\": 0.7573529411764706,\n \"acc_norm_stderr\": 0.026040662474201247\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7483660130718954,\n \"acc_stderr\": 0.01755581809132228,\n \
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.01755581809132228\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.02635891633490403,\n\
\ \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.02635891633490403\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101716,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101716\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276915,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276915\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4455324357405141,\n\
\ \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.6245089770845819,\n\
\ \"mc2_stderr\": 0.01502641583909722\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938278\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5064442759666414,\n \
\ \"acc_stderr\": 0.013771340765699773\n }\n}\n```"
repo_url: https://huggingface.co/ChuckMcSneed/Gembo-v1.1-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|arc:challenge|25_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|gsm8k|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hellaswag|10_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T16-28-59.613230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T16-28-59.613230.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- '**/details_harness|winogrande|5_2024-02-10T16-28-59.613230.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T16-28-59.613230.parquet'
- config_name: results
data_files:
- split: 2024_02_10T16_28_59.613230
path:
- results_2024-02-10T16-28-59.613230.parquet
- split: latest
path:
- results_2024-02-10T16-28-59.613230.parquet
---
# Dataset Card for Evaluation run of ChuckMcSneed/Gembo-v1.1-70b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChuckMcSneed/Gembo-v1.1-70b](https://huggingface.co/ChuckMcSneed/Gembo-v1.1-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChuckMcSneed__Gembo-v1.1-70b",
"harness_winogrande_5",
	split="latest")
```
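As a convenience sketch (not part of the official tooling), the configuration names used above can be derived from the harness task identifiers with a simple string substitution — the pipes, hyphens, and colons in a task name like `harness|hendrycksTest-college_physics|5` become underscores:

```python
def task_to_config(task: str) -> str:
    """Map a harness task identifier to the config_name used in this repo.

    Assumes the naming convention visible in the YAML header, e.g.
    "harness|hendrycksTest-college_physics|5" ->
    "harness_hendrycksTest_college_physics_5".
    """
    return task.replace("|", "_").replace("-", "_").replace(":", "_")


print(task_to_config("harness|hendrycksTest-college_physics|5"))
# -> harness_hendrycksTest_college_physics_5
print(task_to_config("harness|truthfulqa:mc|0"))
# -> harness_truthfulqa_mc_0
```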
## Latest results
These are the [latest results from run 2024-02-10T16:28:59.613230](https://huggingface.co/datasets/open-llm-leaderboard/details_ChuckMcSneed__Gembo-v1.1-70b/blob/main/results_2024-02-10T16-28-59.613230.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7037169866635749,
"acc_stderr": 0.030408999256517827,
"acc_norm": 0.7091585136665425,
"acc_norm_stderr": 0.030988155888902767,
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140354,
"mc2": 0.6245089770845819,
"mc2_stderr": 0.01502641583909722
},
"harness|arc:challenge|25": {
"acc": 0.6715017064846417,
"acc_stderr": 0.013724978465537302,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520764
},
"harness|hellaswag|10": {
"acc": 0.6822346146186019,
"acc_stderr": 0.004646561453031608,
"acc_norm": 0.8689504082852022,
"acc_norm_stderr": 0.003367649220362108
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.030783736757745653,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.030783736757745653
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48677248677248675,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.48677248677248675,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8387096774193549,
"acc_stderr": 0.020923327006423298,
"acc_norm": 0.8387096774193549,
"acc_norm_stderr": 0.020923327006423298
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.898989898989899,
"acc_stderr": 0.021469735576055343,
"acc_norm": 0.898989898989899,
"acc_norm_stderr": 0.021469735576055343
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078894,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078894
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7282051282051282,
"acc_stderr": 0.022556551010132368,
"acc_norm": 0.7282051282051282,
"acc_norm_stderr": 0.022556551010132368
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.027205371538279472,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.027205371538279472
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8972477064220183,
"acc_stderr": 0.013018246509173768,
"acc_norm": 0.8972477064220183,
"acc_norm_stderr": 0.013018246509173768
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.033509916046960436,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.033509916046960436
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.0208711184555521,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.0208711184555521
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878467,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878467
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.03154521672005473,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.03154521672005473
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.031722334260021585,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.031722334260021585
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553848,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553848
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.012331009307795663,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.012331009307795663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8121387283236994,
"acc_stderr": 0.021029269752423214,
"acc_norm": 0.8121387283236994,
"acc_norm_stderr": 0.021029269752423214
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6424581005586593,
"acc_stderr": 0.016029394474894886,
"acc_norm": 0.6424581005586593,
"acc_norm_stderr": 0.016029394474894886
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824765,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824765
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7909967845659164,
"acc_stderr": 0.02309314039837422,
"acc_norm": 0.7909967845659164,
"acc_norm_stderr": 0.02309314039837422
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.021185893615225174,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.021185893615225174
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5425531914893617,
"acc_stderr": 0.029719281272236837,
"acc_norm": 0.5425531914893617,
"acc_norm_stderr": 0.029719281272236837
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.560625814863103,
"acc_stderr": 0.012676014778580217,
"acc_norm": 0.560625814863103,
"acc_norm_stderr": 0.012676014778580217
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7573529411764706,
"acc_stderr": 0.026040662474201247,
"acc_norm": 0.7573529411764706,
"acc_norm_stderr": 0.026040662474201247
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.01755581809132228,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.01755581809132228
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.02635891633490403,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.02635891633490403
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101716,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101716
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.03878626771002361,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.03878626771002361
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276915,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276915
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140354,
"mc2": 0.6245089770845819,
"mc2_stderr": 0.01502641583909722
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938278
},
"harness|gsm8k|5": {
"acc": 0.5064442759666414,
"acc_stderr": 0.013771340765699773
}
}
```
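Per-task entries in a results file shaped like the one above can be aggregated locally. This toy sketch (using a hand-copied subset of the values rather than the downloaded file) averages the MMLU ("hendrycksTest") subject accuracies:

```python
# Hypothetical subset of a results dict, hand-copied from the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|arc:challenge|25": {"acc": 0.6715017064846417},
}

# Keep only the MMLU subjects, then average their accuracies.
mmlu = {k: v["acc"] for k, v in results.items() if "hendrycksTest" in k}
print(sum(mmlu.values()) / len(mmlu))  # -> 0.47 for this toy subset
```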
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]