| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
asus-aics/QALM | 2023-09-15T07:46:43.000Z | [
"region:us"
] | asus-aics | null | null | null | 0 | 0 | Citations to various datasets and documentation to be added |
CyberHarem/yuuki_haru_idolmastercinderellagirls | 2023-09-17T17:38:42.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yuuki_haru (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yuuki_haru (THE iDOLM@STER: Cinderella Girls), containing 200 images and their tags.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 544 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 544 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 544 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 544 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
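As a sketch, a specific archive from the table above could be fetched with the `huggingface_hub` client. The `dataset_filename` helper below is a hypothetical convenience (not part of this repository); the repo id and archive names are taken directly from the table, and the actual download call is shown commented out since it requires network access:

```python
def dataset_filename(name: str) -> str:
    """Map a variant name from the table (e.g. '512x512') to its archive name."""
    return f"dataset-{name}.zip"

# Hypothetical usage (requires the `huggingface_hub` package and network access):
# from huggingface_hub import hf_hub_download
# path = hf_hub_download(
#     "CyberHarem/yuuki_haru_idolmastercinderellagirls",
#     dataset_filename("512x512"),
#     repo_type="dataset",
# )
```

The same naming scheme applies to every variant listed (e.g. `raw`, `stage3-640`), so the helper covers the whole table.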
|
goodfellowliu/Flickr2K | 2023-09-15T07:58:13.000Z | [
"license:apache-2.0",
"region:us"
] | goodfellowliu | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
CyberHarem/yokoyama_chika_idolmastercinderellagirls | 2023-09-17T17:38:44.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yokoyama_chika (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yokoyama_chika (THE iDOLM@STER: Cinderella Girls), containing 84 images and their tags.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 84 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 234 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 84 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 84 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 84 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 84 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 84 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 234 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 234 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 234 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/saionji_kotoka_idolmastercinderellagirls | 2023-09-17T17:38:46.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of saionji_kotoka (THE iDOLM@STER: Cinderella Girls)
This is the dataset of saionji_kotoka (THE iDOLM@STER: Cinderella Girls), containing 81 images and their tags.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 81 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 215 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 81 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 81 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 81 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 81 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 81 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 215 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 215 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 215 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/etou_misaki_idolmastercinderellagirls | 2023-09-17T17:38:48.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of etou_misaki (THE iDOLM@STER: Cinderella Girls)
This is the dataset of etou_misaki (THE iDOLM@STER: Cinderella Girls), containing 29 images and their tags.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 29 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 71 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 29 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 29 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 29 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 29 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 29 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 71 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 71 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 71 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
edbeeching/gia-dataset-tokenized-2024-2 | 2023-09-15T11:03:29.000Z | [
"region:us"
] | edbeeching | null | null | null | 0 | 0 | ---
dataset_info:
- config_name: atari-alien
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: loss_mask
sequence: bool
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_ids
sequence: int32
- name: input_types
sequence: int64
- name: local_positions
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2427492496
num_examples: 1836
download_size: 197411801
dataset_size: 2427492496
- config_name: atari-amidar
features:
- name: loss_mask
sequence: bool
- name: local_positions
sequence: int64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_ids
sequence: int32
- name: input_types
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 23292403388
num_examples: 17641
- name: test
num_bytes: 2157941388
num_examples: 1637
download_size: 1619960876
dataset_size: 25450344776
- config_name: atari-assault
features:
- name: loss_mask
sequence: bool
- name: local_positions
sequence: int64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_ids
sequence: int32
- name: input_types
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 23077576568
num_examples: 17434
- name: test
num_bytes: 1898092400
num_examples: 1436
download_size: 760479036
dataset_size: 24975668968
- config_name: atari-asterix
features:
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: loss_mask
sequence: bool
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 25094377660
num_examples: 19161
download_size: 943683526
dataset_size: 25094377660
- config_name: atari-asteroids
features:
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: loss_mask
sequence: bool
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 22677165856
num_examples: 17112
download_size: 807221186
dataset_size: 22677165856
- config_name: atari-atlantis
features:
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: loss_mask
sequence: bool
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 22825149408
num_examples: 17240
download_size: 745609354
dataset_size: 22825149408
- config_name: atari-bankheist
features:
- name: input_types
sequence: int64
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: input_ids
sequence: int32
- name: loss_mask
sequence: bool
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 23741888116
num_examples: 18043
- name: test
num_bytes: 2701097304
num_examples: 2050
download_size: 2847993069
dataset_size: 26442985420
- config_name: atari-battlezone
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: local_positions
sequence: int64
- name: loss_mask
sequence: bool
- name: input_types
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2683381416
num_examples: 2030
download_size: 162167846
dataset_size: 2683381416
- config_name: atari-berzerk
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: loss_mask
sequence: bool
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2683232284
num_examples: 2025
download_size: 98071291
dataset_size: 2683232284
- config_name: atari-bowling
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: loss_mask
sequence: bool
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2638612892
num_examples: 2001
download_size: 57099861
dataset_size: 2638612892
- config_name: atari-boxing
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: loss_mask
sequence: bool
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2925635312
num_examples: 2252
download_size: 154591181
dataset_size: 2925635312
- config_name: atari-breakout
features:
- name: loss_mask
sequence: bool
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 21372025124
num_examples: 16135
- name: test
num_bytes: 2843462328
num_examples: 2146
download_size: 740521401
dataset_size: 24215487452
- config_name: atari-centipede
features:
- name: loss_mask
sequence: bool
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 24525541956
num_examples: 18727
- name: test
num_bytes: 2743854332
num_examples: 2097
download_size: 886355860
dataset_size: 27269396288
- config_name: atari-choppercommand
features:
- name: loss_mask
sequence: bool
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 21916144968
num_examples: 16598
- name: test
num_bytes: 3130204472
num_examples: 2370
download_size: 1120222280
dataset_size: 25046349440
- config_name: atari-crazyclimber
features:
- name: input_types
sequence: int64
- name: loss_mask
sequence: bool
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: local_positions
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2452295076
num_examples: 1855
download_size: 147409815
dataset_size: 2452295076
- config_name: atari-defender
features:
- name: input_types
sequence: int64
- name: loss_mask
sequence: bool
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: local_positions
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2667101644
num_examples: 2013
download_size: 76162534
dataset_size: 2667101644
- config_name: atari-demonattack
features:
- name: input_types
sequence: int64
- name: loss_mask
sequence: bool
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: local_positions
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2655965584
num_examples: 2004
download_size: 71540075
dataset_size: 2655965584
- config_name: atari-doubledunk
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: local_positions
sequence: int64
- name: input_ids
sequence: int32
- name: input_types
sequence: int64
- name: loss_mask
sequence: bool
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2654251456
num_examples: 2032
download_size: 140407266
dataset_size: 2654251456
- config_name: atari-fishingderby
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: local_positions
sequence: int64
- name: input_ids
sequence: int32
- name: input_types
sequence: int64
- name: loss_mask
sequence: bool
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2865449308
num_examples: 2177
download_size: 236590614
dataset_size: 2865449308
- config_name: atari-freeway
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: local_positions
sequence: int64
- name: input_ids
sequence: int32
- name: input_types
sequence: int64
- name: loss_mask
sequence: bool
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2646386200
num_examples: 2002
download_size: 182728240
dataset_size: 2646386200
- config_name: atari-frostbite
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: loss_mask
sequence: bool
- name: input_ids
sequence: int32
- name: input_types
sequence: int64
- name: local_positions
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 23145553316
num_examples: 17551
- name: test
num_bytes: 2683086716
num_examples: 2033
download_size: 1661407235
dataset_size: 25828640032
- config_name: atari-gravitar
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: loss_mask
sequence: bool
- name: input_ids
sequence: int32
- name: input_types
sequence: int64
- name: local_positions
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 26186279752
num_examples: 20126
- name: test
num_bytes: 2990268724
num_examples: 2299
download_size: 939142901
dataset_size: 29176548476
- config_name: atari-hero
features:
- name: input_ids
sequence: int32
- name: loss_mask
sequence: bool
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_types
sequence: int64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2756503068
num_examples: 2089
download_size: 131026317
dataset_size: 2756503068
- config_name: atari-icehockey
features:
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: loss_mask
sequence: bool
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2538945980
num_examples: 1921
download_size: 89405392
dataset_size: 2538945980
- config_name: atari-jamesbond
features:
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: loss_mask
sequence: bool
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 4473778328
num_examples: 3378
download_size: 224917482
dataset_size: 4473778328
- config_name: atari-kangaroo
features:
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: loss_mask
sequence: bool
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2993217516
num_examples: 2285
download_size: 140119408
dataset_size: 2993217516
- config_name: atari-mspacman
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: loss_mask
sequence: bool
- name: input_types
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2479651844
num_examples: 1879
download_size: 217259145
dataset_size: 2479651844
- config_name: atari-namethisgame
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: loss_mask
sequence: bool
- name: input_types
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 3006648420
num_examples: 2271
download_size: 158870157
dataset_size: 3006648420
- config_name: atari-phoenix
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: loss_mask
sequence: bool
- name: input_types
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2655773200
num_examples: 2004
download_size: 79861580
dataset_size: 2655773200
- config_name: atari-qbert
features:
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: input_types
sequence: int64
- name: loss_mask
sequence: bool
- name: local_positions
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2547887868
num_examples: 1929
download_size: 174392419
dataset_size: 2547887868
- config_name: atari-riverraid
features:
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: input_types
sequence: int64
- name: loss_mask
sequence: bool
- name: local_positions
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2555182372
num_examples: 1943
download_size: 174672084
dataset_size: 2555182372
- config_name: atari-roadrunner
features:
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: input_types
sequence: int64
- name: loss_mask
sequence: bool
- name: local_positions
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2521407028
num_examples: 1915
download_size: 125390334
dataset_size: 2521407028
- config_name: atari-robotank
features:
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: loss_mask
sequence: bool
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_types
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 22475017052
num_examples: 16985
- name: test
num_bytes: 2229677068
num_examples: 1685
download_size: 1298755118
dataset_size: 24704694120
- config_name: atari-seaquest
features:
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: loss_mask
sequence: bool
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_types
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 23841045496
num_examples: 18114
- name: test
num_bytes: 2738008960
num_examples: 2080
download_size: 910338340
dataset_size: 26579054456
- config_name: atari-skiing
features:
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: loss_mask
sequence: bool
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_types
sequence: int64
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 26305597476
num_examples: 20359
- name: test
num_bytes: 2941523916
num_examples: 2277
download_size: 1797518108
dataset_size: 29247121392
- config_name: atari-solaris
features:
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: loss_mask
sequence: bool
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2273188716
num_examples: 1717
download_size: 126936781
dataset_size: 2273188716
- config_name: atari-spaceinvaders
features:
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: loss_mask
sequence: bool
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 4137369016
num_examples: 3122
download_size: 146426375
dataset_size: 4137369016
- config_name: atari-stargunner
features:
- name: input_types
sequence: int64
- name: input_ids
sequence: int32
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: loss_mask
sequence: bool
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2565341980
num_examples: 1937
download_size: 72577790
dataset_size: 2565341980
- config_name: atari-surround
features:
- name: loss_mask
sequence: bool
- name: local_positions
sequence: int64
- name: input_types
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_ids
sequence: int32
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 22468793380
num_examples: 17023
- name: test
num_bytes: 2933488488
num_examples: 2222
download_size: 904796125
dataset_size: 25402281868
- config_name: atari-tennis
features:
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: loss_mask
sequence: bool
- name: input_types
sequence: int64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2484015692
num_examples: 1877
download_size: 95167453
dataset_size: 2484015692
- config_name: atari-timepilot
features:
- name: input_ids
sequence: int32
- name: local_positions
sequence: int64
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: loss_mask
sequence: bool
- name: input_types
sequence: int64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 2558172240
num_examples: 1932
download_size: 86471773
dataset_size: 2558172240
- config_name: atari-tutankham
features:
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: local_positions
sequence: int64
- name: input_types
sequence: int64
- name: loss_mask
sequence: bool
- name: input_ids
sequence: int32
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: attention_mask
sequence: bool
splits:
- name: test
num_bytes: 3517105220
num_examples: 2655
download_size: 144491974
dataset_size: 3517105220
- config_name: atari-videopinball
features:
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_types
sequence: int64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: local_positions
sequence: int64
- name: loss_mask
sequence: bool
- name: input_ids
sequence: int32
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 22581644248
num_examples: 17042
- name: test
num_bytes: 856644644
num_examples: 647
download_size: 1483962740
dataset_size: 23438288892
- config_name: atari-wizardofwor
features:
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: input_types
sequence: int64
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: local_positions
sequence: int64
- name: loss_mask
sequence: bool
- name: input_ids
sequence: int32
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 22744043928
num_examples: 17218
- name: test
num_bytes: 2648734220
num_examples: 2005
download_size: 1739703310
dataset_size: 25392778148
- config_name: atari-yarsrevenge
features:
- name: input_types
sequence: int64
- name: loss_mask
sequence: bool
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: local_positions
sequence: int64
- name: input_ids
sequence: int32
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 22080700236
num_examples: 16669
- name: test
num_bytes: 2579104820
num_examples: 1947
download_size: 3451148232
dataset_size: 24659805056
- config_name: atari-zaxxon
features:
- name: input_types
sequence: int64
- name: loss_mask
sequence: bool
- name: patch_positions
sequence:
sequence:
sequence: float64
- name: local_positions
sequence: int64
- name: input_ids
sequence: int32
- name: patches
sequence:
sequence:
sequence:
sequence: uint8
- name: attention_mask
sequence: bool
splits:
- name: train
num_bytes: 22058040148
num_examples: 16667
- name: test
num_bytes: 2768806832
num_examples: 2092
download_size: 1229966010
dataset_size: 24826846980
configs:
- config_name: atari-alien
data_files:
- split: test
path: atari-alien/test-*
- config_name: atari-amidar
data_files:
- split: train
path: atari-amidar/train-*
- split: test
path: atari-amidar/test-*
- config_name: atari-assault
data_files:
- split: train
path: atari-assault/train-*
- split: test
path: atari-assault/test-*
- config_name: atari-asterix
data_files:
- split: train
path: atari-asterix/train-*
- config_name: atari-asteroids
data_files:
- split: train
path: atari-asteroids/train-*
- config_name: atari-atlantis
data_files:
- split: train
path: atari-atlantis/train-*
- config_name: atari-bankheist
data_files:
- split: train
path: atari-bankheist/train-*
- split: test
path: atari-bankheist/test-*
- config_name: atari-battlezone
data_files:
- split: test
path: atari-battlezone/test-*
- config_name: atari-berzerk
data_files:
- split: test
path: atari-berzerk/test-*
- config_name: atari-bowling
data_files:
- split: test
path: atari-bowling/test-*
- config_name: atari-boxing
data_files:
- split: test
path: atari-boxing/test-*
- config_name: atari-breakout
data_files:
- split: train
path: atari-breakout/train-*
- split: test
path: atari-breakout/test-*
- config_name: atari-centipede
data_files:
- split: train
path: atari-centipede/train-*
- split: test
path: atari-centipede/test-*
- config_name: atari-choppercommand
data_files:
- split: train
path: atari-choppercommand/train-*
- split: test
path: atari-choppercommand/test-*
- config_name: atari-crazyclimber
data_files:
- split: test
path: atari-crazyclimber/test-*
- config_name: atari-defender
data_files:
- split: test
path: atari-defender/test-*
- config_name: atari-demonattack
data_files:
- split: test
path: atari-demonattack/test-*
- config_name: atari-doubledunk
data_files:
- split: test
path: atari-doubledunk/test-*
- config_name: atari-fishingderby
data_files:
- split: test
path: atari-fishingderby/test-*
- config_name: atari-freeway
data_files:
- split: test
path: atari-freeway/test-*
- config_name: atari-frostbite
data_files:
- split: train
path: atari-frostbite/train-*
- split: test
path: atari-frostbite/test-*
- config_name: atari-gravitar
data_files:
- split: train
path: atari-gravitar/train-*
- split: test
path: atari-gravitar/test-*
- config_name: atari-hero
data_files:
- split: test
path: atari-hero/test-*
- config_name: atari-icehockey
data_files:
- split: test
path: atari-icehockey/test-*
- config_name: atari-jamesbond
data_files:
- split: test
path: atari-jamesbond/test-*
- config_name: atari-kangaroo
data_files:
- split: test
path: atari-kangaroo/test-*
- config_name: atari-mspacman
data_files:
- split: test
path: atari-mspacman/test-*
- config_name: atari-namethisgame
data_files:
- split: test
path: atari-namethisgame/test-*
- config_name: atari-phoenix
data_files:
- split: test
path: atari-phoenix/test-*
- config_name: atari-qbert
data_files:
- split: test
path: atari-qbert/test-*
- config_name: atari-riverraid
data_files:
- split: test
path: atari-riverraid/test-*
- config_name: atari-roadrunner
data_files:
- split: test
path: atari-roadrunner/test-*
- config_name: atari-robotank
data_files:
- split: train
path: atari-robotank/train-*
- split: test
path: atari-robotank/test-*
- config_name: atari-seaquest
data_files:
- split: train
path: atari-seaquest/train-*
- split: test
path: atari-seaquest/test-*
- config_name: atari-skiing
data_files:
- split: train
path: atari-skiing/train-*
- split: test
path: atari-skiing/test-*
- config_name: atari-solaris
data_files:
- split: test
path: atari-solaris/test-*
- config_name: atari-spaceinvaders
data_files:
- split: test
path: atari-spaceinvaders/test-*
- config_name: atari-stargunner
data_files:
- split: test
path: atari-stargunner/test-*
- config_name: atari-surround
data_files:
- split: train
path: atari-surround/train-*
- split: test
path: atari-surround/test-*
- config_name: atari-tennis
data_files:
- split: test
path: atari-tennis/test-*
- config_name: atari-timepilot
data_files:
- split: test
path: atari-timepilot/test-*
- config_name: atari-tutankham
data_files:
- split: test
path: atari-tutankham/test-*
- config_name: atari-videopinball
data_files:
- split: train
path: atari-videopinball/train-*
- split: test
path: atari-videopinball/test-*
- config_name: atari-wizardofwor
data_files:
- split: train
path: atari-wizardofwor/train-*
- split: test
path: atari-wizardofwor/test-*
- config_name: atari-yarsrevenge
data_files:
- split: train
path: atari-yarsrevenge/train-*
- split: test
path: atari-yarsrevenge/test-*
- config_name: atari-zaxxon
data_files:
- split: train
path: atari-zaxxon/train-*
- split: test
path: atari-zaxxon/test-*
---
# Dataset Card for "gia-dataset-tokenized-2024-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
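The `configs` list above maps each Atari game to split-specific file globs, and not every game ships both splits. A minimal pure-Python sketch of that resolution logic (entries copied from the list above; `CONFIGS` and `data_files` are illustrative names, not part of the `datasets` library):

```python
# Sketch: resolve a (config, split) request to a file glob, as the `configs`
# frontmatter above does. Some games have only one split (e.g. atari-alien
# has no train split, atari-asterix has no test split).
CONFIGS = {
    "atari-alien": {"test": "atari-alien/test-*"},
    "atari-amidar": {"train": "atari-amidar/train-*", "test": "atari-amidar/test-*"},
    "atari-asterix": {"train": "atari-asterix/train-*"},
}

def data_files(config: str, split: str) -> str:
    try:
        return CONFIGS[config][split]
    except KeyError:
        raise ValueError(f"{config!r} has no {split!r} split in this dataset") from None
```

In practice `datasets.load_dataset` performs this resolution itself when given a `config_name` and `split`.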
CyberHarem/matsuo_chizuru_idolmastercinderellagirls | 2023-09-17T17:38:51.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of matsuo_chizuru (THE iDOLM@STER: Cinderella Girls)
This is the dataset of matsuo_chizuru (THE iDOLM@STER: Cinderella Girls), containing 87 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 87 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 223 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 87 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 87 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 87 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 87 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 87 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 223 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 223 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 223 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
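The download links in the table are relative to the dataset repository. A minimal sketch of building the absolute URL, assuming the standard Hugging Face `resolve/main` file-resolution layout (the helper name is illustrative):

```python
def hf_dataset_file_url(repo_id: str, filename: str) -> str:
    """Direct-download URL for a file hosted in a Hugging Face dataset repo."""
    return f"https://huggingface.co/datasets/{repo_id}/resolve/main/{filename}"

url = hf_dataset_file_url(
    "CyberHarem/matsuo_chizuru_idolmastercinderellagirls", "dataset-raw.zip"
)
```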
|
aboix/q76_campnow_downsampled_noduplicates_upsampled_deletedtest | 2023-09-15T08:15:45.000Z | [
"region:us"
] | aboix | null | null | null | 0 | 0 | Entry not found |
CyberHarem/mizuki_seira_idolmastercinderellagirls | 2023-09-17T17:38:53.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mizuki_seira (THE iDOLM@STER: Cinderella Girls)
This is the dataset of mizuki_seira (THE iDOLM@STER: Cinderella Girls), containing 164 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 164 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 441 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 164 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 164 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 164 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 164 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 164 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 441 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 441 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 441 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
BSC-LT/aguila7b-private-inference | 2023-09-21T09:19:55.000Z | [
"region:us"
] | BSC-LT | null | null | null | 0 | 0 |
# Aguila7b Private Inference
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
cestwc/sample | 2023-09-15T08:50:44.000Z | [
"region:us"
] | cestwc | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: Unique Case Record Key
dtype: int64
- name: Description
dtype: string
- name: Subject
dtype: string
- name: Reporting Sub Category
dtype:
class_label:
names:
'0': CRC Issues
'1': Pedestrian Paths/POB/Linkway - Requests
'2': Parks Infra - Others
'3': Cockroaches
'4': EPS Malfunction
'5': Illegal Parking - Public Housing Disabled Lots
'6': Abandoned Bulky Items/Dumping
'7': Idling Engines
'8': Illegal Parking - Public Housing General Reserved Lots
'9': Noise - Events
'10': Illegal Activities - Others
'11': Hoarding
'12': Tree Removal
'13': Public Housing Infra - Others
'14': Rodents
'15': Playground/Fitness Equipment - Public Housing
'16': Grass Cutting
'17': Public Toilet Issues
'18': Hazardous Toxic
'19': Water Supply and Pressure - Other Public Areas
'20': Smell/Smoky - Food Establishments/Cooking
'21': Illegal Parking - Roads
'22': Illegal Parking - Public Housing Loading and Unloading Bays
'23': Water - Others
'24': Graffiti/Stains
'25': Obstruction of Public Accessibility by Articles
'26': PMDs/PABs/Bicycles Usage Issues
'27': Dust/Smell/Light - Construction
'28': BCA - Building and Construction Matters
'29': Flooding/Ponding
'30': Smell - Drains/Canals/Sewer/Manhole
'31': Noise - Food Establishments/Entertainment Outlets
'32': Bus Shelters - Maintenance
'33': CCTV Issues
'34': Neighbour Disputes
'35': Dogs - Nuisance
'36': Dust - Others
'37': Electricity Supply
'38': Obstruction - Public Housing Common Areas
'39': Pollution - Others
'40': Noise - Others
'41': Noise - Renovation
'42': Water Pipe Maintenance and Issues - Public Housing
'43': Tree Planting
'44': Car Park - Maintenance
'45': Roads/Structures - Maintenance
'46': Street Lights - Maintenance
'47': Dirty Areas/Litter - Other Public Areas
'48': Illegal Advertisements
'49': High Rise Littering/Killer Litter
'50': Lift - Others
'51': Birds - Nuisance
'52': Ceiling Leak
'53': Animals - Others
'54': Traffic Lights - Maintenance
'55': Connectivity Related Infrastructure - Others
'56': Illegal Parking - Motorcycles at Public Housing Common Areas
'57': Pedestrian Crossings
'58': Electrical - Others
'59': Cats - Nuisance
'60': Noise - Construction
'61': Sewer - Other Public Areas
'62': Spalling Concrete - Public Housing Common Areas
'63': Wall Seepage
'64': Urine/Faeces/Spitting
'65': Sewer - Public Housing
'66': Noise - Neighbours
'67': Bees/Wasps/Hornets
'68': Lift - Breakdown
'69': Dead Animals/Birds
'70': Tree/Shrub Maintenance
'71': Corridor Lighting
'72': Bus Shelters - Requests
'73': Car Park - Requests
'74': Drains/Drainage - Public Housing
'75': Spalling Concrete - Within HDB Flat
'76': Dirty Areas/Litter - Public Housing
'77': Water Pipe Maintenance and Issues - Other Public Areas
'78': Traffic Lights - Requests
'79': Waste Pipe Defects - Public Housing
'80': Waste and Recycling Management
'81': Fallen Tree/Branch
'82': Infra - Others
'83': Building Defects
'84': Wet Laundry
'85': Illegal Parking - Heavy Vehicle Parking at Public Housing
'86': Outdoor Lighting
'87': Dirty Drains/Canals
'88': Pedestrian Paths/POB/Linkway - Maintenance
'89': Noise - Congregation in Common Areas
'90': Smoking
'91': Bins/Recycling
'92': Road Works
'93': Illegal Parking - Serious Obstruction
'94': Smell - Other Sources
'95': Pests - Others
'96': Road Signs - Maintenance
'97': Water Quality - Other Public Areas
'98': Air Pollution/Smoke
'99': Drains/Drainage - Other Public Areas
'100': Illegal Parking - Public Housing Car Parks/Service Roads
'101': Mosquitoes
'102': Water Supply and Pressure - Public Housing
'103': Parks Infra - Lighting
- name: Reporting Category
dtype:
class_label:
names:
'0': Cleanliness
'1': Enforcement Matters
'2': Pests
'3': Pollution
'4': General Infrastructure/Facilities
'5': Public Housing Lifts
'6': Connectivity Related Infrastructure
'7': Animals and Birds
'8': Public Housing Infrastructure (Excl Lifts)
'9': Greenery
'10': Neighbour Issues
'11': Illegal Parking
'12': Noise
- name: Preprocessed
dtype: string
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
dtype: int64
- name: tags
sequence:
sequence: float64
splits:
- name: '2017'
num_bytes: 33338600
num_examples: 45675
download_size: 16008523
dataset_size: 33338600
configs:
- config_name: default
data_files:
- split: '2017'
path: data/2017-*
---
# Dataset Card for "sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
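The `Reporting Category` feature above is stored as a `class_label` integer. A minimal pure-Python sketch of mapping those integers back to their names (the list mirrors the `names` table in the feature definition; `int2str` here imitates what `datasets.ClassLabel.int2str` returns):

```python
# Names copied verbatim, in index order, from the `Reporting Category` class_label.
REPORTING_CATEGORY = [
    "Cleanliness",
    "Enforcement Matters",
    "Pests",
    "Pollution",
    "General Infrastructure/Facilities",
    "Public Housing Lifts",
    "Connectivity Related Infrastructure",
    "Animals and Birds",
    "Public Housing Infrastructure (Excl Lifts)",
    "Greenery",
    "Neighbour Issues",
    "Illegal Parking",
    "Noise",
]

def int2str(label: int) -> str:
    # Look up the human-readable name for a stored integer label.
    return REPORTING_CATEGORY[label]
```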
AnthonyRayo/TestforDocumentation | 2023-09-15T08:52:08.000Z | [
"region:us"
] | AnthonyRayo | null | null | null | 0 | 0 | Entry not found |
CyberHarem/kita_hinako_idolmastercinderellagirls | 2023-09-17T17:38:55.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kita_hinako (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kita_hinako (THE iDOLM@STER: Cinderella Girls), containing 71 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 71 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 193 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 71 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 71 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 71 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 71 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 71 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 193 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 193 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 193 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/fukuyama_mai_idolmastercinderellagirls | 2023-09-17T17:38:57.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fukuyama_mai (THE iDOLM@STER: Cinderella Girls)
This is the dataset of fukuyama_mai (THE iDOLM@STER: Cinderella Girls), containing 137 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 137 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 381 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 137 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 137 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 137 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 137 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 137 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 381 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 381 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 381 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
sorimanse/Sori | 2023-09-15T09:26:38.000Z | [
"region:us"
] | sorimanse | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_lloorree__jfdslijsijdgis | 2023-09-17T00:36:07.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lloorree/jfdslijsijdgis
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lloorree/jfdslijsijdgis](https://huggingface.co/lloorree/jfdslijsijdgis) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lloorree__jfdslijsijdgis\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-17T00:34:49.304226](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__jfdslijsijdgis/blob/main/results_2023-09-17T00-34-49.304226.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6907933129316588,\n\
\ \"acc_stderr\": 0.03107455661224763,\n \"acc_norm\": 0.694824769775718,\n\
\ \"acc_norm_stderr\": 0.031044197474221744,\n \"mc1\": 0.41615667074663404,\n\
\ \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.5820460749080146,\n\
\ \"mc2_stderr\": 0.015030523772190541\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6518771331058021,\n \"acc_stderr\": 0.01392100859517935,\n\
\ \"acc_norm\": 0.6962457337883959,\n \"acc_norm_stderr\": 0.013438909184778764\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6760605457080263,\n\
\ \"acc_stderr\": 0.00467020812857923,\n \"acc_norm\": 0.8695478988249352,\n\
\ \"acc_norm_stderr\": 0.0033611183954523846\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.03110318238312338,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.03110318238312338\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781668,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781668\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9533678756476683,\n \"acc_stderr\": 0.015216761819262592,\n\
\ \"acc_norm\": 0.9533678756476683,\n \"acc_norm_stderr\": 0.015216761819262592\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7153846153846154,\n \"acc_stderr\": 0.022878322799706304,\n\
\ \"acc_norm\": 0.7153846153846154,\n \"acc_norm_stderr\": 0.022878322799706304\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.026841514322958934,\n\
\ \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.026841514322958934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\"\
: 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8935779816513761,\n\
\ \"acc_stderr\": 0.013221554674594372,\n \"acc_norm\": 0.8935779816513761,\n\
\ \"acc_norm_stderr\": 0.013221554674594372\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n\
\ \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.01990739979131695,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01990739979131695\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802263,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802263\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.031545216720054725,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.031545216720054725\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517964,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517964\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\
\ \"acc_stderr\": 0.012185528166499978,\n \"acc_norm\": 0.8659003831417624,\n\
\ \"acc_norm_stderr\": 0.012185528166499978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071124,\n\
\ \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.511731843575419,\n\
\ \"acc_stderr\": 0.016717897676932162,\n \"acc_norm\": 0.511731843575419,\n\
\ \"acc_norm_stderr\": 0.016717897676932162\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.021185893615225188,\n\
\ \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.021185893615225188\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5397653194263363,\n\
\ \"acc_stderr\": 0.012729785386598545,\n \"acc_norm\": 0.5397653194263363,\n\
\ \"acc_norm_stderr\": 0.012729785386598545\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n\
\ \"acc_stderr\": 0.025000256039546195,\n \"acc_norm\": 0.8122448979591836,\n\
\ \"acc_norm_stderr\": 0.025000256039546195\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.0211662163046594,\n\
\ \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.0211662163046594\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n\
\ \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n\
\ \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n\
\ \"acc_stderr\": 0.02567934272327692,\n \"acc_norm\": 0.8713450292397661,\n\
\ \"acc_norm_stderr\": 0.02567934272327692\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.41615667074663404,\n \"mc1_stderr\": 0.017255657502903043,\n\
\ \"mc2\": 0.5820460749080146,\n \"mc2_stderr\": 0.015030523772190541\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lloorree/jfdslijsijdgis
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|arc:challenge|25_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|arc:challenge|25_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hellaswag|10_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hellaswag|10_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T09-43-22.432852.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-17T00-34-49.304226.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-17T00-34-49.304226.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T09-43-22.432852.parquet'
- split: 2023_09_17T00_34_49.304226
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-17T00-34-49.304226.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-17T00-34-49.304226.parquet'
- config_name: results
data_files:
- split: 2023_09_15T09_43_22.432852
path:
- results_2023-09-15T09-43-22.432852.parquet
- split: 2023_09_17T00_34_49.304226
path:
- results_2023-09-17T00-34-49.304226.parquet
- split: latest
path:
- results_2023-09-17T00-34-49.304226.parquet
---
# Dataset Card for Evaluation run of lloorree/jfdslijsijdgis
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lloorree/jfdslijsijdgis
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lloorree/jfdslijsijdgis](https://huggingface.co/lloorree/jfdslijsijdgis) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lloorree__jfdslijsijdgis",
"harness_truthfulqa_mc_0",
	split="latest")
```
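The per-run split names listed in the configuration above encode the run timestamp, with `-` and `:` replaced by `_`. As an illustration (the helper function below is a sketch, not part of the dataset), a split name can be converted back into a `datetime`:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names look like "2023_09_17T00_34_49.304226": the date part
    # uses "_" in place of "-" and the time part "_" in place of ":".
    date_part, time_part = split_name.split("T")
    iso = f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}"
    return datetime.fromisoformat(iso)

run_time = split_to_datetime("2023_09_17T00_34_49.304226")
```

This makes it easy to sort the timestamped splits chronologically when a repository accumulates many runs.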
## Latest results
These are the [latest results from run 2023-09-17T00:34:49.304226](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__jfdslijsijdgis/blob/main/results_2023-09-17T00-34-49.304226.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6907933129316588,
"acc_stderr": 0.03107455661224763,
"acc_norm": 0.694824769775718,
"acc_norm_stderr": 0.031044197474221744,
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903043,
"mc2": 0.5820460749080146,
"mc2_stderr": 0.015030523772190541
},
"harness|arc:challenge|25": {
"acc": 0.6518771331058021,
"acc_stderr": 0.01392100859517935,
"acc_norm": 0.6962457337883959,
"acc_norm_stderr": 0.013438909184778764
},
"harness|hellaswag|10": {
"acc": 0.6760605457080263,
"acc_stderr": 0.00467020812857923,
"acc_norm": 0.8695478988249352,
"acc_norm_stderr": 0.0033611183954523846
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.03110318238312338,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.03110318238312338
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781668,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781668
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9533678756476683,
"acc_stderr": 0.015216761819262592,
"acc_norm": 0.9533678756476683,
"acc_norm_stderr": 0.015216761819262592
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7153846153846154,
"acc_stderr": 0.022878322799706304,
"acc_norm": 0.7153846153846154,
"acc_norm_stderr": 0.022878322799706304
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.026841514322958934,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.026841514322958934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603397,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603397
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8935779816513761,
"acc_stderr": 0.013221554674594372,
"acc_norm": 0.8935779816513761,
"acc_norm_stderr": 0.013221554674594372
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.01990739979131695,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.01990739979131695
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802263,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802263
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.031545216720054725,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.031545216720054725
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002158,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002158
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517964,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517964
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8659003831417624,
"acc_stderr": 0.012185528166499978,
"acc_norm": 0.8659003831417624,
"acc_norm_stderr": 0.012185528166499978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7658959537572254,
"acc_stderr": 0.022797110278071124,
"acc_norm": 0.7658959537572254,
"acc_norm_stderr": 0.022797110278071124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.511731843575419,
"acc_stderr": 0.016717897676932162,
"acc_norm": 0.511731843575419,
"acc_norm_stderr": 0.016717897676932162
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.021185893615225188,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.021185893615225188
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5397653194263363,
"acc_stderr": 0.012729785386598545,
"acc_norm": 0.5397653194263363,
"acc_norm_stderr": 0.012729785386598545
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.75,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.75,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546195,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546195
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.0211662163046594,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.0211662163046594
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903043,
"mc2": 0.5820460749080146,
"mc2_stderr": 0.015030523772190541
}
}
```
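As a worked illustration of how such per-task numbers roll up into a single score, here is a small sketch (the task subset and metric choice are illustrative only; the official leaderboard applies its own aggregation rules):

```python
# Excerpt of the results JSON above: one headline metric per task.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6962457337883959},
    "harness|hellaswag|10": {"acc_norm": 0.8695478988249352},
    "harness|truthfulqa:mc|0": {"mc2": 0.5820460749080146},
}

def mean_metric(per_task: dict) -> float:
    # Average the first reported metric of each task.
    values = [next(iter(metrics.values())) for metrics in per_task.values()]
    return sum(values) / len(values)

average = mean_metric(results)  # ~0.716
```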
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/koga_koharu_idolmastercinderellagirls | 2023-09-17T17:38:59.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of koga_koharu (THE iDOLM@STER: Cinderella Girls)
This is the dataset of koga_koharu (THE iDOLM@STER: Cinderella Girls), containing 82 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 82 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 203 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 82 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 82 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 82 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 82 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 82 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 203 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 203 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 203 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
qgyd2021/music_comment | 2023-09-19T03:34:24.000Z | [
"size_categories:100M<n<1B",
"language:zh",
"license:apache-2.0",
"music",
"region:us"
] | qgyd2021 | null | @dataset{music_comment,
author = {Xing Tian},
title = {music_comment},
month = sep,
year = 2023,
publisher = {Xing Tian},
version = {1.0},
} | null | 0 | 0 | ---
license: apache-2.0
language:
- zh
tags:
- music
size_categories:
- 100M<n<1B
---
## Information on 490,000 Songs from Hong Kong, Taiwan, and Mainland China
The data comes from [QQMusicSpider](https://github.com/yangjianxin1/QQMusicSpider).
The data can be used for:
* Writing lyrics based on the artist.
* Writing lyrics based on the song title.
* Writing comments based on the song title.
|
BangumiBase/kumakumakumabear | 2023-09-29T08:08:18.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Kuma Kuma Kuma Bear
This is the image base of the bangumi Kuma Kuma Kuma Bear. We detected 99 characters and 6688 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 801 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 135 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 55 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 78 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 22 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 45 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 26 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 17 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 40 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 47 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 25 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 24 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 14 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 21 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 16 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 19 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 128 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 20 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 22 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 58 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 12 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 180 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 15 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 14 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 49 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 13 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 60 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 15 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 21 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 103 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 16 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 12 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 35 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 8 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 14 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 15 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 10 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 16 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 33 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 17 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 70 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 10 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 26 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 1939 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 105 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 22 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 36 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 38 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 8 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 7 | [Download](49/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 50 | 69 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 66 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 7 | [Download](52/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 53 | 22 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 7 | [Download](54/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 55 | 14 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 197 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 52 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 8 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 29 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 62 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 26 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 69 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 30 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 11 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 55 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 15 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 204 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 283 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 26 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 40 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| 71 | 17 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| 72 | 8 | [Download](72/dataset.zip) |  |  |  |  |  |  |  |  |
| 73 | 13 | [Download](73/dataset.zip) |  |  |  |  |  |  |  |  |
| 74 | 18 | [Download](74/dataset.zip) |  |  |  |  |  |  |  |  |
| 75 | 16 | [Download](75/dataset.zip) |  |  |  |  |  |  |  |  |
| 76 | 8 | [Download](76/dataset.zip) |  |  |  |  |  |  |  |  |
| 77 | 10 | [Download](77/dataset.zip) |  |  |  |  |  |  |  |  |
| 78 | 51 | [Download](78/dataset.zip) |  |  |  |  |  |  |  |  |
| 79 | 135 | [Download](79/dataset.zip) |  |  |  |  |  |  |  |  |
| 80 | 62 | [Download](80/dataset.zip) |  |  |  |  |  |  |  |  |
| 81 | 14 | [Download](81/dataset.zip) |  |  |  |  |  |  |  |  |
| 82 | 48 | [Download](82/dataset.zip) |  |  |  |  |  |  |  |  |
| 83 | 15 | [Download](83/dataset.zip) |  |  |  |  |  |  |  |  |
| 84 | 14 | [Download](84/dataset.zip) |  |  |  |  |  |  |  |  |
| 85 | 38 | [Download](85/dataset.zip) |  |  |  |  |  |  |  |  |
| 86 | 6 | [Download](86/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 87 | 14 | [Download](87/dataset.zip) |  |  |  |  |  |  |  |  |
| 88 | 8 | [Download](88/dataset.zip) |  |  |  |  |  |  |  |  |
| 89 | 9 | [Download](89/dataset.zip) |  |  |  |  |  |  |  |  |
| 90 | 6 | [Download](90/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 91 | 5 | [Download](91/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 92 | 38 | [Download](92/dataset.zip) |  |  |  |  |  |  |  |  |
| 93 | 29 | [Download](93/dataset.zip) |  |  |  |  |  |  |  |  |
| 94 | 7 | [Download](94/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 95 | 17 | [Download](95/dataset.zip) |  |  |  |  |  |  |  |  |
| 96 | 24 | [Download](96/dataset.zip) |  |  |  |  |  |  |  |  |
| 97 | 11 | [Download](97/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 223 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
ppsanjay/dataerpici | 2023-09-18T11:49:44.000Z | [
"region:us"
] | ppsanjay | null | null | null | 0 | 0 | Entry not found |
marsexpress/none | 2023-09-15T10:16:47.000Z | [
"region:us"
] | marsexpress | null | null | null | 0 | 0 | Entry not found |
CyberHarem/harada_miyo_idolmastercinderellagirls | 2023-09-17T17:39:01.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of harada_miyo (THE iDOLM@STER: Cinderella Girls)
This is the dataset of harada_miyo (THE iDOLM@STER: Cinderella Girls), containing 64 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 64 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 172 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 64 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 64 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 64 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 64 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 64 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 172 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 172 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 172 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
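The packaged zips above can also be fetched programmatically. A minimal stdlib sketch follows; the `pack_url` / `fetch_pack` helpers and the `/resolve/main/` URL convention are assumptions of this sketch, not part of any official tooling:

```python
from pathlib import Path
from urllib.request import urlretrieve
from zipfile import ZipFile


def pack_url(repo_id: str, filename: str) -> str:
    # Hugging Face serves raw repo files under the /resolve/<revision>/ path.
    return f"https://huggingface.co/datasets/{repo_id}/resolve/main/{filename}"


def fetch_pack(repo_id: str, pack: str, out_dir: str = "data") -> Path:
    """Download one packaged zip (e.g. 'raw' or '640x880') and extract it."""
    url = pack_url(repo_id, f"dataset-{pack}.zip")
    target = Path(out_dir) / pack
    target.mkdir(parents=True, exist_ok=True)
    archive, _ = urlretrieve(url, target / "dataset.zip")
    with ZipFile(archive) as zf:
        zf.extractall(target)
    return target


# fetch_pack("CyberHarem/harada_miyo_idolmastercinderellagirls", "640x880")
```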
|
CyberHarem/yagami_makino_idolmastercinderellagirls | 2023-09-17T17:39:03.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yagami_makino (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yagami_makino (THE iDOLM@STER: Cinderella Girls), containing 161 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 161 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 405 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 161 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 161 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 161 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 161 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 161 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 405 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 405 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 405 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
ISOBIM/GeometricCommand | 2023-09-15T10:30:15.000Z | [
"license:other",
"region:us"
] | ISOBIM | null | null | null | 0 | 0 | ---
license: other
---
|
NickKolok/regs-nextphoto | 2023-09-16T15:03:23.000Z | [
"license:gpl-3.0",
"region:us"
] | NickKolok | null | null | null | 0 | 0 | ---
license: gpl-3.0
---
|
CyberHarem/okazaki_yasuha_idolmastercinderellagirls | 2023-09-17T17:39:05.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of okazaki_yasuha (THE iDOLM@STER: Cinderella Girls)
This is the dataset of okazaki_yasuha (THE iDOLM@STER: Cinderella Girls), containing 50 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 50 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 127 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 50 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 50 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 50 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 50 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 50 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 127 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 127 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 127 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_NewstaR__Morningstar-13b-hf | 2023-09-15T10:47:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NewstaR/Morningstar-13b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NewstaR/Morningstar-13b-hf](https://huggingface.co/NewstaR/Morningstar-13b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NewstaR__Morningstar-13b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-15T10:46:30.957408](https://huggingface.co/datasets/open-llm-leaderboard/details_NewstaR__Morningstar-13b-hf/blob/main/results_2023-09-15T10-46-30.957408.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5478679055985619,\n\
\ \"acc_stderr\": 0.03451143209518346,\n \"acc_norm\": 0.551637824239685,\n\
\ \"acc_norm_stderr\": 0.03449240971096488,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454895,\n \"mc2\": 0.44118181192718914,\n\
\ \"mc2_stderr\": 0.01575597129997008\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5580204778156996,\n \"acc_stderr\": 0.014512682523128343,\n\
\ \"acc_norm\": 0.590443686006826,\n \"acc_norm_stderr\": 0.01437035863247244\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6292571200955985,\n\
\ \"acc_stderr\": 0.0048201660022530795,\n \"acc_norm\": 0.819259111730731,\n\
\ \"acc_norm_stderr\": 0.0038401692240122715\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.0416656757710158,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.0416656757710158\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819064,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819064\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.02727389059430064,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.02727389059430064\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253811,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253811\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.02534967290683866,\n \
\ \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.02534967290683866\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881564,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881564\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03242225027115007,\n \
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03242225027115007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7321100917431193,\n \"acc_stderr\": 0.018987462257978652,\n \"\
acc_norm\": 0.7321100917431193,\n \"acc_norm_stderr\": 0.018987462257978652\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.03324708911809117,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.03324708911809117\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395592,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395592\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794089,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794089\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.037466683254700206,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.037466683254700206\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009175,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009175\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124655,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n\
\ \"acc_stderr\": 0.015461169002371539,\n \"acc_norm\": 0.3094972067039106,\n\
\ \"acc_norm_stderr\": 0.015461169002371539\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809068,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809068\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.02712511551316687,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.02712511551316687\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3917861799217731,\n\
\ \"acc_stderr\": 0.012467564418145121,\n \"acc_norm\": 0.3917861799217731,\n\
\ \"acc_norm_stderr\": 0.012467564418145121\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030372836961539352,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030372836961539352\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.5441176470588235,\n \"acc_stderr\": 0.02014893942041574,\n \"\
acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.02014893942041574\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030802,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030802\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454895,\n \"mc2\": 0.44118181192718914,\n\
\ \"mc2_stderr\": 0.01575597129997008\n }\n}\n```"
repo_url: https://huggingface.co/NewstaR/Morningstar-13b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|arc:challenge|25_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hellaswag|10_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T10-46-30.957408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T10-46-30.957408.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T10-46-30.957408.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T10-46-30.957408.parquet'
- config_name: results
data_files:
- split: 2023_09_15T10_46_30.957408
path:
- results_2023-09-15T10-46-30.957408.parquet
- split: latest
path:
- results_2023-09-15T10-46-30.957408.parquet
---
# Dataset Card for Evaluation run of NewstaR/Morningstar-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NewstaR/Morningstar-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NewstaR/Morningstar-13b-hf](https://huggingface.co/NewstaR/Morningstar-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NewstaR__Morningstar-13b-hf",
"harness_truthfulqa_mc_0",
	split="latest")
```
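The config names listed above follow a mechanical convention: take the raw task identifier used in the parquet filenames (e.g. `harness|hendrycksTest-abstract_algebra|5`) and replace the `|`, `:`, and `-` separators with underscores. A small helper (hypothetical, not part of the `datasets` API) sketches the mapping:

```python
def config_name(task: str, num_fewshot: int) -> str:
    """Build the dataset config name for a given eval task and few-shot count.

    Example mappings (matching the configs in this card):
      "hendrycksTest-abstract_algebra", 5 -> "harness_hendrycksTest_abstract_algebra_5"
      "truthfulqa:mc", 0                  -> "harness_truthfulqa_mc_0"
    """
    raw = f"harness|{task}|{num_fewshot}"
    # Separators (|, :, -) are not valid in config names, so they all
    # collapse to underscores.
    return raw.replace("|", "_").replace(":", "_").replace("-", "_")

print(config_name("hendrycksTest-abstract_algebra", 5))
print(config_name("truthfulqa:mc", 0))
```

This can be handy for loading several task configs in a loop instead of hard-coding each name.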
## Latest results
These are the [latest results from run 2023-09-15T10:46:30.957408](https://huggingface.co/datasets/open-llm-leaderboard/details_NewstaR__Morningstar-13b-hf/blob/main/results_2023-09-15T10-46-30.957408.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5478679055985619,
"acc_stderr": 0.03451143209518346,
"acc_norm": 0.551637824239685,
"acc_norm_stderr": 0.03449240971096488,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454895,
"mc2": 0.44118181192718914,
"mc2_stderr": 0.01575597129997008
},
"harness|arc:challenge|25": {
"acc": 0.5580204778156996,
"acc_stderr": 0.014512682523128343,
"acc_norm": 0.590443686006826,
"acc_norm_stderr": 0.01437035863247244
},
"harness|hellaswag|10": {
"acc": 0.6292571200955985,
"acc_stderr": 0.0048201660022530795,
"acc_norm": 0.819259111730731,
"acc_norm_stderr": 0.0038401692240122715
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.0416656757710158,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.0416656757710158
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819064,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819064
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.02727389059430064,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.02727389059430064
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.03646204963253811,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.03646204963253811
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4948717948717949,
"acc_stderr": 0.02534967290683866,
"acc_norm": 0.4948717948717949,
"acc_norm_stderr": 0.02534967290683866
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881564,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881564
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7321100917431193,
"acc_stderr": 0.018987462257978652,
"acc_norm": 0.7321100917431193,
"acc_norm_stderr": 0.018987462257978652
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.03324708911809117,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.03324708911809117
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395592,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395592
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.037466683254700206,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.037466683254700206
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009175,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009175
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371539,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371539
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809068,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809068
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.02712511551316687,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.02712511551316687
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.029097675599463926,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.029097675599463926
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3917861799217731,
"acc_stderr": 0.012467564418145121,
"acc_norm": 0.3917861799217731,
"acc_norm_stderr": 0.012467564418145121
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5,
"acc_stderr": 0.030372836961539352,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030372836961539352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.02014893942041574,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.02014893942041574
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030802,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030802
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454895,
"mc2": 0.44118181192718914,
"mc2_stderr": 0.01575597129997008
}
}
```
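For a quick headline figure, the per-task accuracies in the JSON above can be averaged into a single unweighted mean; a minimal sketch (variable names are illustrative) using three of the values shown:

```python
# Per-task 5-shot accuracies copied from the results JSON above (subset only).
task_acc = {
    "hendrycksTest-jurisprudence": 0.6944444444444444,
    "hendrycksTest-machine_learning": 0.35714285714285715,
    "hendrycksTest-management": 0.7378640776699029,
}

# Unweighted mean accuracy across the selected tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(f"mean acc over {len(task_acc)} tasks: {mean_acc:.4f}")
```

Note that this averages over tasks, not over questions, so small subjects weigh as much as large ones.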
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HakaiNoRyuu/ds | 2023-09-15T10:49:23.000Z | [
"region:us"
] | HakaiNoRyuu | null | null | null | 0 | 0 | Entry not found |
maze/aigc | 2023-09-23T11:33:09.000Z | [
"size_categories:10K<n<100K",
"region:us"
] | maze | null | null | null | 0 | 0 | ---
size_categories:
- 10K<n<100K
---
Asian photography dataset
- [win3000](https://www.win3000.com/): about 18k Asian celebrity photos.
- [jiepaigou](http://www.jiepaigou.com/): street snaps and celebrity photos.
- [cybesx](https://www.cybesx.com): about 13k street photography images. |
BBGAME605065444/1 | 2023-09-15T11:12:50.000Z | [
"region:us"
] | BBGAME605065444 | null | null | null | 0 | 0 | Entry not found |
Juneuarie/QingqueEN | 2023-09-15T11:14:56.000Z | [
"region:us"
] | Juneuarie | null | null | null | 0 | 0 | Entry not found |
tiendung/chai | 2023-09-15T22:31:13.000Z | [
"region:us"
] | tiendung | null | null | null | 0 | 0 | Entry not found |
CyberHarem/ayase_honoka_idolmastercinderellagirls | 2023-09-17T17:39:07.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ayase_honoka (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ayase_honoka (THE iDOLM@STER: Cinderella Girls), containing 103 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 103 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 284 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 103 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 103 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 103 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 103 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 103 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 284 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 284 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 284 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
HaltiaAI/Her-The-Movie-Samantha-and-Theodore-Dataset | 2023-09-15T13:28:07.000Z | [
"license:other",
"Movie Dialog",
"Her The Movie",
"Dialogs from the Her Movie (2013)",
"region:us"
] | HaltiaAI | null | null | null | 1 | 0 | ---
license: other
tags:
- Movie Dialog
- Her The Movie
- Dialogs from the Her Movie (2013)
--- |
CyberHarem/zaizen_tokiko_idolmastercinderellagirls | 2023-09-17T17:39:09.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of zaizen_tokiko (THE iDOLM@STER: Cinderella Girls)
This is the dataset of zaizen_tokiko (THE iDOLM@STER: Cinderella Girls), containing 145 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 145 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 374 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 145 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 145 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 145 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 145 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 145 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 374 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 374 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 374 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/yao_feifei_idolmastercinderellagirls | 2023-09-17T17:39:11.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yao_feifei (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yao_feifei (THE iDOLM@STER: Cinderella Girls), containing 37 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 37 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 103 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 37 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 37 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 37 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 37 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 37 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 103 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 103 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 103 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/koganeikoito_edomaeelf | 2023-09-17T17:39:13.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of こがねいこいと
This is the dataset of こがねいこいと, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 631 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 631 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 631 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 631 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/kurokawa_chiaki_idolmastercinderellagirls | 2023-09-17T17:39:15.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kurokawa_chiaki (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kurokawa_chiaki (THE iDOLM@STER: Cinderella Girls), containing 64 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 64 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 162 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 64 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 64 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 64 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 64 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 64 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 162 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 162 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 162 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Coroseven/KaguyaShinomiya | 2023-09-17T12:42:16.000Z | [
"region:us"
] | Coroseven | null | null | null | 0 | 0 | Entry not found |
DemiseKing/sdcos | 2023-09-15T12:21:39.000Z | [
"license:openrail",
"region:us"
] | DemiseKing | null | null | null | 0 | 0 | ---
license: openrail
---
|
CyberHarem/eruda_edomaeelf | 2023-09-17T17:39:17.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of エルダ
This is the dataset of エルダ, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 667 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 667 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 667 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 667 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/seki_hiromi_idolmastercinderellagirls | 2023-09-17T17:39:19.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of seki_hiromi (THE iDOLM@STER: Cinderella Girls)
This is the dataset of seki_hiromi (THE iDOLM@STER: Cinderella Girls), containing 115 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 115 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 297 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 115 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 115 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 115 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 115 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 115 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 297 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 297 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 297 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/saejima_kiyomi_idolmastercinderellagirls | 2023-09-17T17:39:21.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of saejima_kiyomi (THE iDOLM@STER: Cinderella Girls)
This is the dataset of saejima_kiyomi (THE iDOLM@STER: Cinderella Girls), containing 45 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 45 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 110 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 45 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 45 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 45 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 45 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 45 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 110 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 110 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 110 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/matsuyama_kumiko_idolmastercinderellagirls | 2023-09-17T17:39:23.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of matsuyama_kumiko (THE iDOLM@STER: Cinderella Girls)
This is the dataset of matsuyama_kumiko (THE iDOLM@STER: Cinderella Girls), containing 35 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 35 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 90 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 35 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 35 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 35 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 35 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 35 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 90 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 90 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 90 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/sakurabakoma_edomaeelf | 2023-09-17T17:39:25.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of さくらばこま
This is the dataset of さくらばこま, containing 94 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 94 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 209 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 94 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 94 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 94 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 94 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 94 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 209 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 209 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 209 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_gpt2_public | 2023-09-15T12:47:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of gpt2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gpt2](https://huggingface.co/gpt2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gpt2_public\"\
,\n\t\"harness_drop_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-09-15T12:47:31.231445](https://huggingface.co/datasets/open-llm-leaderboard/details_gpt2_public/blob/main/results_2023-09-15T12-47-31.231445.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.039,\n \"f1_stderr\": 0.028301943396169812\n\
\ },\n \"harness|drop|0\": {\n \"em\": 0.0,\n \"em_stderr\"\
: 0.0,\n \"f1\": 0.039,\n \"f1_stderr\": 0.028301943396169812\n \
\ }\n}\n```"
repo_url: https://huggingface.co/gpt2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_0
data_files:
- split: 2023_09_15T12_47_31.231445
path:
- '**/details_harness|drop|0_2023-09-15T12-47-31.231445.parquet'
- split: latest
path:
- '**/details_harness|drop|0_2023-09-15T12-47-31.231445.parquet'
- config_name: results
data_files:
- split: 2023_09_15T12_47_31.231445
path:
- results_2023-09-15T12-47-31.231445.parquet
- split: latest
path:
- results_2023-09-15T12-47-31.231445.parquet
---
# Dataset Card for Evaluation run of gpt2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/gpt2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [gpt2](https://huggingface.co/gpt2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gpt2_public",
"harness_drop_0",
split="train")
```
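Each per-run split name in the `configs` section above is simply the run timestamp with `-` and `:` replaced by `_`; a small sketch of that mapping (the helper name is illustrative, not part of the `datasets` API):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp into the split name used in this card's configs."""
    return timestamp.replace("-", "_").replace(":", "_")

# The run shown in this card:
print(run_timestamp_to_split("2023-09-15T12:47:31.231445"))  # → 2023_09_15T12_47_31.231445
```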
## Latest results
These are the [latest results from run 2023-09-15T12:47:31.231445](https://huggingface.co/datasets/open-llm-leaderboard/details_gpt2_public/blob/main/results_2023-09-15T12-47-31.231445.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.039,
"f1_stderr": 0.028301943396169812
},
"harness|drop|0": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.039,
"f1_stderr": 0.028301943396169812
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
VanoInvestigations/BOE-1988-2022 | 2023-09-15T12:53:41.000Z | [
"region:us"
] | VanoInvestigations | null | null | null | 0 | 0 | Entry not found |
CyberHarem/koganeikoyuzu_edomaeelf | 2023-09-17T17:39:27.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of こがねいこゆず
This is the dataset of こがねいこゆず, containing 80 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 80 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 182 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 80 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 80 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 80 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 80 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 80 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 182 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 182 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 182 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/nagatomi_hasumi_idolmastercinderellagirls | 2023-09-17T17:39:30.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nagatomi_hasumi (THE iDOLM@STER: Cinderella Girls)
This is the dataset of nagatomi_hasumi (THE iDOLM@STER: Cinderella Girls), containing 42 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 42 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 111 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 42 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 42 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 42 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 42 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 42 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 111 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 111 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 111 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
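The archives in the table above can also be fetched programmatically. A minimal sketch with `huggingface_hub` (the helper names `archive_name`/`fetch_aligned` and the output directory are illustrative assumptions, not part of this repository):

```python
from zipfile import ZipFile

REPO_ID = "CyberHarem/nagatomi_hasumi_idolmastercinderellagirls"

def archive_name(size: str) -> str:
    # The table above names each aligned archive "dataset-<WxH>.zip".
    return f"dataset-{size}.zip"

def fetch_aligned(size: str = "640x880", out_dir: str = "nagatomi_hasumi") -> str:
    """Download one aligned archive from the dataset repo and unpack it."""
    # Third-party dependency: pip install huggingface_hub
    from huggingface_hub import hf_hub_download

    archive = hf_hub_download(
        repo_id=REPO_ID,
        filename=archive_name(size),
        repo_type="dataset",
    )
    with ZipFile(archive) as zf:
        zf.extractall(out_dir)
    return out_dir
```

Calling `fetch_aligned("512x704")` would download `dataset-512x704.zip` and extract it locally; the raw archives with meta information can be fetched the same way.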
|
bongo2112/diamondplatnumz-SDxl-output-images | 2023-09-15T18:45:34.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/wakiyama_tamami_idolmastercinderellagirls | 2023-09-17T17:39:32.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of wakiyama_tamami (THE iDOLM@STER: Cinderella Girls)
This is the dataset of wakiyama_tamami (THE iDOLM@STER: Cinderella Girls), containing 43 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 43 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 119 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 43 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 43 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 43 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 43 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 43 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 119 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 119 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 119 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Ifra12/llama2 | 2023-09-15T13:28:09.000Z | [
"region:us"
] | Ifra12 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/tsuchiya_ako_idolmastercinderellagirls | 2023-09-17T17:39:34.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tsuchiya_ako (THE iDOLM@STER: Cinderella Girls)
This is the dataset of tsuchiya_ako (THE iDOLM@STER: Cinderella Girls), containing 33 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 33 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 88 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 33 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 33 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 33 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 33 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 33 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 88 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 88 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 88 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
TuningAI/Cover_letter_v2 | 2023-09-15T13:40:41.000Z | [
"region:us"
] | TuningAI | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V1.0 | 2023-09-15T13:42:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V1.0](https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-15T13:41:24.143261](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V1.0/blob/main/results_2023-09-15T13-41-24.143261.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.49399597506549714,\n\
\ \"acc_stderr\": 0.03511123221635479,\n \"acc_norm\": 0.4979736893285183,\n\
\ \"acc_norm_stderr\": 0.03510066112158077,\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.44703431694949036,\n\
\ \"mc2_stderr\": 0.014683290152252474\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4667235494880546,\n \"acc_stderr\": 0.014578995859605813,\n\
\ \"acc_norm\": 0.5068259385665529,\n \"acc_norm_stderr\": 0.014610029151379813\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5590519816769568,\n\
\ \"acc_stderr\": 0.0049548591067816545,\n \"acc_norm\": 0.7536347341167098,\n\
\ \"acc_norm_stderr\": 0.004300131223340694\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.03070948699255655,\n\
\ \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.03070948699255655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3702127659574468,\n \"acc_stderr\": 0.03156564682236784,\n\
\ \"acc_norm\": 0.3702127659574468,\n \"acc_norm_stderr\": 0.03156564682236784\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.023456037383982026,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.023456037383982026\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n\
\ \"acc_stderr\": 0.028229497320317213,\n \"acc_norm\": 0.5612903225806452,\n\
\ \"acc_norm_stderr\": 0.028229497320317213\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.032957975663112704,\n\
\ \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.032957975663112704\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6262626262626263,\n \"acc_stderr\": 0.03446897738659333,\n \"\
acc_norm\": 0.6262626262626263,\n \"acc_norm_stderr\": 0.03446897738659333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4461538461538462,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.4461538461538462,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6788990825688074,\n \"acc_stderr\": 0.02001814977273375,\n \"\
acc_norm\": 0.6788990825688074,\n \"acc_norm_stderr\": 0.02001814977273375\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.03256850570293647,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.03256850570293647\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6127450980392157,\n \"acc_stderr\": 0.03418931233833344,\n \"\
acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.03418931233833344\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610795,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610795\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\
\ \"acc_stderr\": 0.03298574607842821,\n \"acc_norm\": 0.5919282511210763,\n\
\ \"acc_norm_stderr\": 0.03298574607842821\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5521472392638037,\n \"acc_stderr\": 0.03906947479456606,\n\
\ \"acc_norm\": 0.5521472392638037,\n \"acc_norm_stderr\": 0.03906947479456606\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\
\ \"acc_stderr\": 0.028286324075564393,\n \"acc_norm\": 0.7521367521367521,\n\
\ \"acc_norm_stderr\": 0.028286324075564393\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6896551724137931,\n\
\ \"acc_stderr\": 0.01654378502604831,\n \"acc_norm\": 0.6896551724137931,\n\
\ \"acc_norm_stderr\": 0.01654378502604831\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5028901734104047,\n \"acc_stderr\": 0.026918645383239004,\n\
\ \"acc_norm\": 0.5028901734104047,\n \"acc_norm_stderr\": 0.026918645383239004\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2770949720670391,\n\
\ \"acc_stderr\": 0.014968772435812145,\n \"acc_norm\": 0.2770949720670391,\n\
\ \"acc_norm_stderr\": 0.014968772435812145\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.02840830202033269,\n\
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.02840830202033269\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n\
\ \"acc_stderr\": 0.02809924077580956,\n \"acc_norm\": 0.572347266881029,\n\
\ \"acc_norm_stderr\": 0.02809924077580956\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5679012345679012,\n \"acc_stderr\": 0.027563010971606676,\n\
\ \"acc_norm\": 0.5679012345679012,\n \"acc_norm_stderr\": 0.027563010971606676\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3546099290780142,\n \"acc_stderr\": 0.02853865002887864,\n \
\ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.02853865002887864\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34159061277705344,\n\
\ \"acc_stderr\": 0.012112391320842849,\n \"acc_norm\": 0.34159061277705344,\n\
\ \"acc_norm_stderr\": 0.012112391320842849\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47058823529411764,\n \"acc_stderr\": 0.02019280827143379,\n \
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.02019280827143379\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5142857142857142,\n \"acc_stderr\": 0.03199615232806286,\n\
\ \"acc_norm\": 0.5142857142857142,\n \"acc_norm_stderr\": 0.03199615232806286\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355041,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355041\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.44703431694949036,\n\
\ \"mc2_stderr\": 0.014683290152252474\n }\n}\n```"
repo_url: https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|arc:challenge|25_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hellaswag|10_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T13-41-24.143261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T13-41-24.143261.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T13-41-24.143261.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T13-41-24.143261.parquet'
- config_name: results
data_files:
- split: 2023_09_15T13_41_24.143261
path:
- results_2023-09-15T13-41-24.143261.parquet
- split: latest
path:
- results_2023-09-15T13-41-24.143261.parquet
---
# Dataset Card for Evaluation run of elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V1.0](https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V1.0",
"harness_truthfulqa_mc_0",
	split="latest")
```
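Split names encode the run timestamp. If you ever need to pick the newest run yourself across several timestamped splits, rather than relying on the "latest" split, the naming convention can be parsed as shown below (a small illustrative helper, not part of the evaluation harness):

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names look like "2023_09_15T13_41_24.143261":
    # underscores replace the usual "-" and ":" separators.
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

# With several timestamped splits, the newest one sorts last.
runs = ["2023_09_15T13_41_24.143261"]
latest = max(runs, key=split_to_datetime)
```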
## Latest results
These are the [latest results from run 2023-09-15T13:41:24.143261](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V1.0/blob/main/results_2023-09-15T13-41-24.143261.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its results and "latest" split):
```python
{
"all": {
"acc": 0.49399597506549714,
"acc_stderr": 0.03511123221635479,
"acc_norm": 0.4979736893285183,
"acc_norm_stderr": 0.03510066112158077,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.44703431694949036,
"mc2_stderr": 0.014683290152252474
},
"harness|arc:challenge|25": {
"acc": 0.4667235494880546,
"acc_stderr": 0.014578995859605813,
"acc_norm": 0.5068259385665529,
"acc_norm_stderr": 0.014610029151379813
},
"harness|hellaswag|10": {
"acc": 0.5590519816769568,
"acc_stderr": 0.0049548591067816545,
"acc_norm": 0.7536347341167098,
"acc_norm_stderr": 0.004300131223340694
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.03070948699255655,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.03070948699255655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3702127659574468,
"acc_stderr": 0.03156564682236784,
"acc_norm": 0.3702127659574468,
"acc_norm_stderr": 0.03156564682236784
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.023456037383982026,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.023456037383982026
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317213,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317213
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3251231527093596,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.3251231527093596,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6262626262626263,
"acc_stderr": 0.03446897738659333,
"acc_norm": 0.6262626262626263,
"acc_norm_stderr": 0.03446897738659333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4461538461538462,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.4461538461538462,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6788990825688074,
"acc_stderr": 0.02001814977273375,
"acc_norm": 0.6788990825688074,
"acc_norm_stderr": 0.02001814977273375
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.03256850570293647,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.03256850570293647
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.03418931233833344,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.03418931233833344
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030685820596610795,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030685820596610795
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842821,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842821
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5521472392638037,
"acc_stderr": 0.03906947479456606,
"acc_norm": 0.5521472392638037,
"acc_norm_stderr": 0.03906947479456606
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.028286324075564393,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.028286324075564393
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.01654378502604831,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.01654378502604831
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.026918645383239004,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.026918645383239004
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2770949720670391,
"acc_stderr": 0.014968772435812145,
"acc_norm": 0.2770949720670391,
"acc_norm_stderr": 0.014968772435812145
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.02840830202033269,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.02840830202033269
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.02809924077580956,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.02809924077580956
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5679012345679012,
"acc_stderr": 0.027563010971606676,
"acc_norm": 0.5679012345679012,
"acc_norm_stderr": 0.027563010971606676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.02853865002887864,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.02853865002887864
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34159061277705344,
"acc_stderr": 0.012112391320842849,
"acc_norm": 0.34159061277705344,
"acc_norm_stderr": 0.012112391320842849
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.02019280827143379,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.02019280827143379
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5142857142857142,
"acc_stderr": 0.03199615232806286,
"acc_norm": 0.5142857142857142,
"acc_norm_stderr": 0.03199615232806286
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355041,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355041
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.44703431694949036,
"mc2_stderr": 0.014683290152252474
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/wakui_rumi_idolmastercinderellagirls | 2023-09-17T17:39:36.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of wakui_rumi (THE iDOLM@STER: Cinderella Girls)
This is the dataset of wakui_rumi (THE iDOLM@STER: Cinderella Girls), containing 35 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 35 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 96 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 35 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 35 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 35 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 35 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 35 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 96 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 96 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 96 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Krinx/edm.txt | 2023-09-15T13:58:31.000Z | [
"region:us"
] | Krinx | null | null | null | 0 | 0 | Entry not found |
CyberHarem/kitami_yuzu_idolmastercinderellagirls | 2023-09-17T17:39:38.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kitami_yuzu (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kitami_yuzu (THE iDOLM@STER: Cinderella Girls), containing 156 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 156 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 422 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 156 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 156 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 156 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 156 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 156 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 422 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 422 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 422 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/ebihara_naho_idolmastercinderellagirls | 2023-09-17T17:39:40.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ebihara_naho (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ebihara_naho (THE iDOLM@STER: Cinderella Girls), containing 78 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 78 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 222 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 78 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 78 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 78 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 78 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 78 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 222 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 222 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 222 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/kudou_shinobu_idolmastercinderellagirls | 2023-09-17T17:39:42.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kudou_shinobu (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kudou_shinobu (THE iDOLM@STER: Cinderella Girls), containing 30 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 30 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 81 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 30 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 30 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 30 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 30 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 30 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 81 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 81 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 81 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/matsumoto_sarina_idolmastercinderellagirls | 2023-09-17T17:39:44.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of matsumoto_sarina (THE iDOLM@STER: Cinderella Girls)
This is the dataset of matsumoto_sarina (THE iDOLM@STER: Cinderella Girls), containing 69 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 69 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 180 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 69 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 69 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 69 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 69 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 69 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 180 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 180 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 180 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
rkparalian/autotrain-data-12 | 2023-09-15T14:49:42.000Z | [
"region:us"
] | rkparalian | null | null | null | 0 | 0 | Entry not found |
CyberHarem/namiki_meiko_idolmastercinderellagirls | 2023-09-17T17:39:46.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of namiki_meiko (THE iDOLM@STER: Cinderella Girls)
This is the dataset of namiki_meiko (THE iDOLM@STER: Cinderella Girls), containing 34 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 34 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 91 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 34 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 34 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 34 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 34 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 34 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 91 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 91 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 91 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
dabdib/collections | 2023-09-21T14:38:25.000Z | [
"region:us"
] | dabdib | null | null | null | 0 | 0 | Entry not found |
CyberHarem/fujii_tomo_idolmastercinderellagirls | 2023-09-17T17:39:48.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fujii_tomo (THE iDOLM@STER: Cinderella Girls)
This is the dataset of fujii_tomo (THE iDOLM@STER: Cinderella Girls), containing 50 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 50 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 127 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 50 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 50 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 50 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 50 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 50 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 127 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 127 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 127 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/nishikawa_honami_idolmastercinderellagirls | 2023-09-17T17:39:50.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nishikawa_honami (THE iDOLM@STER: Cinderella Girls)
This is the dataset of nishikawa_honami (THE iDOLM@STER: Cinderella Girls), containing 25 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 25 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 63 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 25 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 25 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 25 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 25 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 25 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 63 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 63 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 63 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/eve_santaclaus_idolmastercinderellagirls | 2023-09-17T17:39:52.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of eve_santaclaus (THE iDOLM@STER: Cinderella Girls)
This is the dataset of eve_santaclaus (THE iDOLM@STER: Cinderella Girls), containing 98 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 98 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 269 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 98 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 98 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 98 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 98 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 98 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 269 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 269 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 269 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/hiiragi_shino_idolmastercinderellagirls | 2023-09-17T17:39:54.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hiiragi_shino (THE iDOLM@STER: Cinderella Girls)
This is the dataset of hiiragi_shino (THE iDOLM@STER: Cinderella Girls), containing 42 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 42 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 112 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 42 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 42 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 42 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 42 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 42 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 112 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 112 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 112 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
anton-l/swh_missing_repos | 2023-09-15T15:46:10.000Z | [
"region:us"
] | anton-l | null | null | null | 0 | 0 | Entry not found |
fffiloni/new_dataset | 2023-09-15T15:46:59.000Z | [
"region:us"
] | fffiloni | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 848679.0
num_examples: 2
download_size: 848790
dataset_size: 848679.0
---
# Dataset Card for "new_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/saitou_youko_idolmastercinderellagirls | 2023-09-17T17:39:57.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of saitou_youko (THE iDOLM@STER: Cinderella Girls)
This is the dataset of saitou_youko (THE iDOLM@STER: Cinderella Girls), containing 22 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 22 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 61 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 22 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 22 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 22 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 22 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 22 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 61 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 61 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 61 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/kitagawa_mahiro_idolmastercinderellagirls | 2023-09-17T17:39:59.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kitagawa_mahiro (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kitagawa_mahiro (THE iDOLM@STER: Cinderella Girls), containing 22 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 22 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 60 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 22 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 22 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 22 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 22 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 22 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 60 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 60 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 60 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
shijli/wmt14-deen | 2023-09-16T15:44:08.000Z | [
"region:us"
] | shijli | null | null | null | 0 | 0 | # WMT 2014 German-English Translation Dataset
This dataset was built with fairseq's processing script, the original of which can be
found [here](https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh)
You can recreate this dataset by simply running:
```commandline
git clone https://huggingface.co/datasets/shijli/wmt14-deen
cd wmt14-deen/data
bash prepare-wmt14.sh
```
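Once the script finishes, the processed corpus is just a pair of aligned text files per split. A minimal sketch of reading it, assuming the script leaves tokenized files named like `train.de` and `train.en` in the data directory (the exact output names depend on the script version, so treat these paths as placeholders):

```python
from pathlib import Path
from typing import Iterator, Tuple

def read_parallel(data_dir: str, split: str = "train",
                  src: str = "de", tgt: str = "en") -> Iterator[Tuple[str, str]]:
    """Yield aligned (source, target) sentence pairs from two side-by-side text files."""
    src_path = Path(data_dir) / f"{split}.{src}"
    tgt_path = Path(data_dir) / f"{split}.{tgt}"
    # The two files are line-aligned: line i of each file forms one sentence pair.
    with src_path.open(encoding="utf-8") as fs, tgt_path.open(encoding="utf-8") as ft:
        for s, t in zip(fs, ft):
            yield s.strip(), t.strip()
```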
`binarized.dist.de-en.zip` and `binarized.dist.en-de.zip` are distilled datasets generated by a transformer base model.
They can be built by running:
```commandline
bash prepare-wmt14-distill.sh /path/to/fairseq/model source-lang target-lang
```
To build the distilled datasets, you need to create `binarized.zip` first. Note that a distilled dataset only uses
model-generated target sentences, which means that different translation directions result in different datasets;
therefore you need to specify `source-lang` and `target-lang` explicitly. Also, replace `/path/to/fairseq/model` with
the path to your pretrained model. |
open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble-v2 | 2023-09-15T16:07:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of oh-yeontaek/llama-2-70B-LoRA-assemble-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [oh-yeontaek/llama-2-70B-LoRA-assemble-v2](https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-15T16:06:18.387785](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble-v2/blob/main/results_2023-09-15T16-06-18.387785.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6931761249212157,\n\
\ \"acc_stderr\": 0.031300161246260914,\n \"acc_norm\": 0.6971025131327819,\n\
\ \"acc_norm_stderr\": 0.03127024725201448,\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.01746379386716811,\n \"mc2\": 0.6478807414957388,\n\
\ \"mc2_stderr\": 0.014914964973799093\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068079,\n\
\ \"acc_norm\": 0.7184300341296929,\n \"acc_norm_stderr\": 0.013143376735009022\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6721768571997611,\n\
\ \"acc_stderr\": 0.0046846063106423304,\n \"acc_norm\": 0.8688508265285799,\n\
\ \"acc_norm_stderr\": 0.00336873543416138\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7433962264150943,\n \"acc_stderr\": 0.02688064788905199,\n\
\ \"acc_norm\": 0.7433962264150943,\n \"acc_norm_stderr\": 0.02688064788905199\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.03216600808802267,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.03216600808802267\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6553191489361702,\n \"acc_stderr\": 0.031068985963122145,\n\
\ \"acc_norm\": 0.6553191489361702,\n \"acc_norm_stderr\": 0.031068985963122145\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424386,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424386\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4656084656084656,\n \"acc_stderr\": 0.02569032176249384,\n \"\
acc_norm\": 0.4656084656084656,\n \"acc_norm_stderr\": 0.02569032176249384\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.832258064516129,\n\
\ \"acc_stderr\": 0.021255464065371314,\n \"acc_norm\": 0.832258064516129,\n\
\ \"acc_norm_stderr\": 0.021255464065371314\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.02931118867498312,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.02931118867498312\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9040404040404041,\n \"acc_stderr\": 0.020984808610047933,\n \"\
acc_norm\": 0.9040404040404041,\n \"acc_norm_stderr\": 0.020984808610047933\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.02329088805377272,\n \
\ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.02329088805377272\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.026841514322958934,\n\
\ \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.026841514322958934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8844036697247707,\n \"acc_stderr\": 0.013708749534172636,\n \"\
acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.013708749534172636\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8970588235294118,\n \"acc_stderr\": 0.021328337570804365,\n \"\
acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.021328337570804365\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.02133174182974679,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.02133174182974679\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n\
\ \"acc_stderr\": 0.028380391147094706,\n \"acc_norm\": 0.7668161434977578,\n\
\ \"acc_norm_stderr\": 0.028380391147094706\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476073,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476073\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.02058849131609238,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.02058849131609238\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\
\ \"acc_stderr\": 0.012185528166499983,\n \"acc_norm\": 0.8659003831417624,\n\
\ \"acc_norm_stderr\": 0.012185528166499983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7774566473988439,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.7774566473988439,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5597765363128492,\n\
\ \"acc_stderr\": 0.01660256461504993,\n \"acc_norm\": 0.5597765363128492,\n\
\ \"acc_norm_stderr\": 0.01660256461504993\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972949,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972949\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7588424437299035,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.7588424437299035,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.022409674547304168,\n\
\ \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.022409674547304168\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5531914893617021,\n \"acc_stderr\": 0.02965823509766691,\n \
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.02965823509766691\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5586701434159062,\n\
\ \"acc_stderr\": 0.012682016335646678,\n \"acc_norm\": 0.5586701434159062,\n\
\ \"acc_norm_stderr\": 0.012682016335646678\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887653,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887653\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7434640522875817,\n \"acc_stderr\": 0.017667841612379005,\n \
\ \"acc_norm\": 0.7434640522875817,\n \"acc_norm_stderr\": 0.017667841612379005\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.026537045312145298,\n\
\ \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.026537045312145298\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n\
\ \"mc1_stderr\": 0.01746379386716811,\n \"mc2\": 0.6478807414957388,\n\
\ \"mc2_stderr\": 0.014914964973799093\n }\n}\n```"
repo_url: https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|arc:challenge|25_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hellaswag|10_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T16-06-18.387785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T16-06-18.387785.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T16-06-18.387785.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T16-06-18.387785.parquet'
- config_name: results
data_files:
- split: 2023_09_15T16_06_18.387785
path:
- results_2023-09-15T16-06-18.387785.parquet
- split: latest
path:
- results_2023-09-15T16-06-18.387785.parquet
---
# Dataset Card for Evaluation run of oh-yeontaek/llama-2-70B-LoRA-assemble-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [oh-yeontaek/llama-2-70B-LoRA-assemble-v2](https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble-v2",
"harness_truthfulqa_mc_0",
	split="latest")
```
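Split names other than `latest` encode the run timestamp (`2023_09_15T16_06_18.387785` here), so the most recent run can also be selected programmatically. A minimal sketch under that naming assumption — `newest_split` is a hypothetical helper, not part of the `datasets` API:

```python
from datetime import datetime

# Hypothetical helper: among a configuration's split names, return the
# most recent timestamped one ("latest" is just an alias for it).
def newest_split(split_names):
    dated = [s for s in split_names if s != "latest"]
    return max(dated, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

print(newest_split(["2023_09_15T16_06_18.387785", "latest"]))
# -> 2023_09_15T16_06_18.387785
```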
## Latest results
These are the [latest results from run 2023-09-15T16:06:18.387785](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble-v2/blob/main/results_2023-09-15T16-06-18.387785.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6931761249212157,
"acc_stderr": 0.031300161246260914,
"acc_norm": 0.6971025131327819,
"acc_norm_stderr": 0.03127024725201448,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.01746379386716811,
"mc2": 0.6478807414957388,
"mc2_stderr": 0.014914964973799093
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.013592431519068079,
"acc_norm": 0.7184300341296929,
"acc_norm_stderr": 0.013143376735009022
},
"harness|hellaswag|10": {
"acc": 0.6721768571997611,
"acc_stderr": 0.0046846063106423304,
"acc_norm": 0.8688508265285799,
"acc_norm_stderr": 0.00336873543416138
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7433962264150943,
"acc_stderr": 0.02688064788905199,
"acc_norm": 0.7433962264150943,
"acc_norm_stderr": 0.02688064788905199
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802267,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802267
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6553191489361702,
"acc_stderr": 0.031068985963122145,
"acc_norm": 0.6553191489361702,
"acc_norm_stderr": 0.031068985963122145
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424386,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424386
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4656084656084656,
"acc_stderr": 0.02569032176249384,
"acc_norm": 0.4656084656084656,
"acc_norm_stderr": 0.02569032176249384
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.832258064516129,
"acc_stderr": 0.021255464065371314,
"acc_norm": 0.832258064516129,
"acc_norm_stderr": 0.021255464065371314
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5467980295566502,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.5467980295566502,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.02931118867498312,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.02931118867498312
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9040404040404041,
"acc_stderr": 0.020984808610047933,
"acc_norm": 0.9040404040404041,
"acc_norm_stderr": 0.020984808610047933
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6974358974358974,
"acc_stderr": 0.02329088805377272,
"acc_norm": 0.6974358974358974,
"acc_norm_stderr": 0.02329088805377272
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.026841514322958934,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.026841514322958934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.013708749534172636,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.013708749534172636
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.021328337570804365,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.021328337570804365
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.02133174182974679,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.02133174182974679
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094706,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094706
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476073,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476073
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02058849131609238,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02058849131609238
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8659003831417624,
"acc_stderr": 0.012185528166499983,
"acc_norm": 0.8659003831417624,
"acc_norm_stderr": 0.012185528166499983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7774566473988439,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.7774566473988439,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5597765363128492,
"acc_stderr": 0.01660256461504993,
"acc_norm": 0.5597765363128492,
"acc_norm_stderr": 0.01660256461504993
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972949,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972949
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7588424437299035,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.7588424437299035,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.022409674547304168,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.022409674547304168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.02965823509766691,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.02965823509766691
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5586701434159062,
"acc_stderr": 0.012682016335646678,
"acc_norm": 0.5586701434159062,
"acc_norm_stderr": 0.012682016335646678
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887653,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887653
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7434640522875817,
"acc_stderr": 0.017667841612379005,
"acc_norm": 0.7434640522875817,
"acc_norm_stderr": 0.017667841612379005
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.026537045312145298,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.026537045312145298
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.01746379386716811,
"mc2": 0.6478807414957388,
"mc2_stderr": 0.014914964973799093
}
}
```
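The headline `acc` under `all` appears to be a macro-average of the per-task `acc` values (an assumption based on the numbers, not documented here). A minimal sketch of that aggregation, illustrated with a hand-copied two-task subset, so it will not reproduce the full 0.6932 figure:

```python
# Macro-average the per-task "acc" values (two-task subset copied from above;
# the real "all" figure averages every harness task, not just these two).
per_task = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5240963855421686},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8596491228070176},
}
accs = [m["acc"] for m in per_task.values()]
macro_avg = sum(accs) / len(accs)
print(round(macro_avg, 4))
# -> 0.6919
```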
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/senzaki_ema_idolmastercinderellagirls | 2023-09-17T17:40:01.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of senzaki_ema (THE iDOLM@STER: Cinderella Girls)
This is the dataset of senzaki_ema (THE iDOLM@STER: Cinderella Girls), containing 43 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 43 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 118 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 43 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 43 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 43 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 43 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 43 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 118 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 118 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 118 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
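Each archive in the table is an ordinary file in this repository, so a direct link can be built from the Hub's standard `resolve` URL layout. A sketch — `archive_url` is a hypothetical helper, and the repo id and filename are taken from this card:

```python
# Hypothetical helper: build the direct download URL for one of the
# archives listed above (standard Hub layout for dataset repo files).
def archive_url(repo_id: str, filename: str, revision: str = "main") -> str:
    return f"https://huggingface.co/datasets/{repo_id}/resolve/{revision}/{filename}"

print(archive_url("CyberHarem/senzaki_ema_idolmastercinderellagirls", "dataset-384x512.zip"))
```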
|
CyberHarem/yamato_aki_idolmastercinderellagirls | 2023-09-17T17:40:03.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yamato_aki (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yamato_aki (THE iDOLM@STER: Cinderella Girls), containing 108 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 108 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 286 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 108 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 108 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 108 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 108 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 108 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 286 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 286 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 286 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/hyoudou_rena_idolmastercinderellagirls | 2023-09-17T17:40:05.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hyoudou_rena (THE iDOLM@STER: Cinderella Girls)
This is the dataset of hyoudou_rena (THE iDOLM@STER: Cinderella Girls), containing 42 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 42 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 118 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 42 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 42 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 42 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 42 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 42 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 118 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 118 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 118 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/aozaki_touko_karanokyoukai | 2023-09-17T17:40:07.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Aozaki Touko
This is the dataset of Aozaki Touko, containing 156 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 156 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 338 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 156 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 156 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 156 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 156 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 156 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 338 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 338 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 338 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/yoshioka_saki_idolmastercinderellagirls | 2023-09-17T17:40:09.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yoshioka_saki (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yoshioka_saki (THE iDOLM@STER: Cinderella Girls), containing 37 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 37 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 95 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 37 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 37 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 37 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 37 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 37 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 95 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 95 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 95 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/matsubara_saya_idolmastercinderellagirls | 2023-09-17T17:40:12.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of matsubara_saya (THE iDOLM@STER: Cinderella Girls)
This is the dataset of matsubara_saya (THE iDOLM@STER: Cinderella Girls), containing 19 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 19 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 47 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 19 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 19 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 19 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 19 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 19 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 47 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 47 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 47 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Zerenidel/mayjjkbl | 2023-09-15T16:58:23.000Z | [
"region:us"
] | Zerenidel | null | null | null | 0 | 0 | Entry not found |
CyberHarem/nanjou_hikaru_idolmastercinderellagirls | 2023-09-17T17:40:15.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nanjou_hikaru (THE iDOLM@STER: Cinderella Girls)
This is the dataset of nanjou_hikaru (THE iDOLM@STER: Cinderella Girls), containing 71 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 71 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 195 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 71 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 71 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 71 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 71 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 71 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 195 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 195 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 195 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
KyonBS/data | 2023-09-15T17:19:25.000Z | [
"region:us"
] | KyonBS | null | null | null | 0 | 0 | Entry not found |
kuronomiki/anjeg | 2023-09-15T17:34:26.000Z | [
"license:other",
"region:us"
] | kuronomiki | null | null | null | 0 | 0 | ---
license: other
---
|
JOSEDURANisc/vit-model | 2023-09-15T17:33:46.000Z | [
"region:us"
] | JOSEDURANisc | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble-v3 | 2023-09-15T17:37:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of oh-yeontaek/llama-2-70B-LoRA-assemble-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [oh-yeontaek/llama-2-70B-LoRA-assemble-v3](https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble-v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-15T17:36:30.757691](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble-v3/blob/main/results_2023-09-15T17-36-30.757691.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6985803552112708,\n\
\ \"acc_stderr\": 0.03118492094070661,\n \"acc_norm\": 0.7024274155828159,\n\
\ \"acc_norm_stderr\": 0.031154550420018332,\n \"mc1\": 0.47980416156670747,\n\
\ \"mc1_stderr\": 0.01748921684973705,\n \"mc2\": 0.658093697491632,\n\
\ \"mc2_stderr\": 0.014747866760131165\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6860068259385665,\n \"acc_stderr\": 0.013562691224726291,\n\
\ \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.013106784883601334\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6820354511053575,\n\
\ \"acc_stderr\": 0.004647338877642188,\n \"acc_norm\": 0.8740290778729337,\n\
\ \"acc_norm_stderr\": 0.0033113844981586464\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7547169811320755,\n \"acc_stderr\": 0.026480357179895695,\n\
\ \"acc_norm\": 0.7547169811320755,\n \"acc_norm_stderr\": 0.026480357179895695\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.03216600808802267,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.03216600808802267\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.03057944277361034,\n\
\ \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.03057944277361034\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"\
acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346758\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.02188617856717253,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.02188617856717253\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781675,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781675\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.022390787638216763,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.022390787638216763\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.02329088805377272,\n \
\ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.02329088805377272\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827947,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827947\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8954128440366973,\n \"acc_stderr\": 0.013120530245265586,\n \"\
acc_norm\": 0.8954128440366973,\n \"acc_norm_stderr\": 0.013120530245265586\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9019607843137255,\n \"acc_stderr\": 0.020871118455552097,\n \"\
acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.020871118455552097\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868837,\n \
\ \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868837\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n\
\ \"acc_stderr\": 0.028568079464714274,\n \"acc_norm\": 0.7623318385650224,\n\
\ \"acc_norm_stderr\": 0.028568079464714274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744632,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744632\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237104,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237104\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.01987565502786745,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.01987565502786745\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.859514687100894,\n\
\ \"acc_stderr\": 0.012426211353093448,\n \"acc_norm\": 0.859514687100894,\n\
\ \"acc_norm_stderr\": 0.012426211353093448\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071128,\n\
\ \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.582122905027933,\n\
\ \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.582122905027933,\n\
\ \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7427652733118971,\n\
\ \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.7427652733118971,\n\
\ \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8117283950617284,\n \"acc_stderr\": 0.021751866060815882,\n\
\ \"acc_norm\": 0.8117283950617284,\n \"acc_norm_stderr\": 0.021751866060815882\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.02949482760014436,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.02949482760014436\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5788787483702738,\n\
\ \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.5788787483702738,\n\
\ \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625162,\n\
\ \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625162\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146613,\n \
\ \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146613\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7959183673469388,\n \"acc_stderr\": 0.025801283475090496,\n\
\ \"acc_norm\": 0.7959183673469388,\n \"acc_norm_stderr\": 0.025801283475090496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.02207632610182466,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.02207632610182466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47980416156670747,\n\
\ \"mc1_stderr\": 0.01748921684973705,\n \"mc2\": 0.658093697491632,\n\
\ \"mc2_stderr\": 0.014747866760131165\n }\n}\n```"
repo_url: https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|arc:challenge|25_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hellaswag|10_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T17-36-30.757691.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-15T17-36-30.757691.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T17-36-30.757691.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-15T17-36-30.757691.parquet'
- config_name: results
data_files:
- split: 2023_09_15T17_36_30.757691
path:
- results_2023-09-15T17-36-30.757691.parquet
- split: latest
path:
- results_2023-09-15T17-36-30.757691.parquet
---
# Dataset Card for Evaluation run of oh-yeontaek/llama-2-70B-LoRA-assemble-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [oh-yeontaek/llama-2-70B-LoRA-assemble-v3](https://huggingface.co/oh-yeontaek/llama-2-70B-LoRA-assemble-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble-v3",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-15T17:36:30.757691](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-70B-LoRA-assemble-v3/blob/main/results_2023-09-15T17-36-30.757691.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the "results" and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6985803552112708,
"acc_stderr": 0.03118492094070661,
"acc_norm": 0.7024274155828159,
"acc_norm_stderr": 0.031154550420018332,
"mc1": 0.47980416156670747,
"mc1_stderr": 0.01748921684973705,
"mc2": 0.658093697491632,
"mc2_stderr": 0.014747866760131165
},
"harness|arc:challenge|25": {
"acc": 0.6860068259385665,
"acc_stderr": 0.013562691224726291,
"acc_norm": 0.7209897610921502,
"acc_norm_stderr": 0.013106784883601334
},
"harness|hellaswag|10": {
"acc": 0.6820354511053575,
"acc_stderr": 0.004647338877642188,
"acc_norm": 0.8740290778729337,
"acc_norm_stderr": 0.0033113844981586464
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047424,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047424
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7547169811320755,
"acc_stderr": 0.026480357179895695,
"acc_norm": 0.7547169811320755,
"acc_norm_stderr": 0.026480357179895695
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802267,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802267
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.03057944277361034,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.03057944277361034
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.025715239811346758,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.025715239811346758
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.02188617856717253,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.02188617856717253
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781675,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781675
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.022390787638216763,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.022390787638216763
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6974358974358974,
"acc_stderr": 0.02329088805377272,
"acc_norm": 0.6974358974358974,
"acc_norm_stderr": 0.02329088805377272
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827947,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827947
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8954128440366973,
"acc_stderr": 0.013120530245265586,
"acc_norm": 0.8954128440366973,
"acc_norm_stderr": 0.013120530245265586
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.020871118455552097,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.020871118455552097
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.020343400734868837,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.020343400734868837
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.028568079464714274,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.028568079464714274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744632,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744632
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237104,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786745,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786745
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.859514687100894,
"acc_stderr": 0.012426211353093448,
"acc_norm": 0.859514687100894,
"acc_norm_stderr": 0.012426211353093448
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7658959537572254,
"acc_stderr": 0.022797110278071128,
"acc_norm": 0.7658959537572254,
"acc_norm_stderr": 0.022797110278071128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.582122905027933,
"acc_stderr": 0.016495400635820084,
"acc_norm": 0.582122905027933,
"acc_norm_stderr": 0.016495400635820084
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7427652733118971,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.7427652733118971,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8117283950617284,
"acc_stderr": 0.021751866060815882,
"acc_norm": 0.8117283950617284,
"acc_norm_stderr": 0.021751866060815882
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.02949482760014436,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.02949482760014436
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5788787483702738,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.5788787483702738,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7242647058823529,
"acc_stderr": 0.027146271936625162,
"acc_norm": 0.7242647058823529,
"acc_norm_stderr": 0.027146271936625162
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146613,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7959183673469388,
"acc_stderr": 0.025801283475090496,
"acc_norm": 0.7959183673469388,
"acc_norm_stderr": 0.025801283475090496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.02207632610182466,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.02207632610182466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47980416156670747,
"mc1_stderr": 0.01748921684973705,
"mc2": 0.658093697491632,
"mc2_stderr": 0.014747866760131165
}
}
```
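As a rough illustration of how these numbers fit together, the sketch below macro-averages the accuracy of a few per-task entries copied from the results above, and checks that the reported `acc_stderr` is consistent with the sample standard error of a Bernoulli mean, `sqrt(p * (1 - p) / (n - 1))`. The subset sizes (`n`) are the standard MMLU test-split question counts, which are an assumption here, not something stated in this card.

```python
import math

# A small subset of the per-task entries copied from the results above.
# "n" is the assumed MMLU test-split size for each subject.
tasks = {
    "hendrycksTest-abstract_algebra": {"acc": 0.4, "acc_stderr": 0.049236596391733084, "n": 100},
    "hendrycksTest-anatomy": {"acc": 0.6370370370370371, "acc_stderr": 0.041539484047424, "n": 135},
    "hendrycksTest-astronomy": {"acc": 0.7828947368421053, "acc_stderr": 0.03355045304882924, "n": 152},
}

# Macro-average accuracy over the selected tasks (each task weighted equally).
macro_acc = sum(t["acc"] for t in tasks.values()) / len(tasks)

# Recompute each task's standard error as sqrt(p * (1 - p) / (n - 1)) and
# compare it with the value reported in the results JSON.
for name, t in tasks.items():
    recomputed = math.sqrt(t["acc"] * (1 - t["acc"]) / (t["n"] - 1))
    print(f"{name}: reported={t['acc_stderr']:.6f} recomputed={recomputed:.6f}")

print(f"macro-average acc over subset: {macro_acc:.4f}")
```

This is only a sanity check on a three-task subset; the `"all"` entry in the results above averages over every evaluated task.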
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/kusakabe_wakaba_idolmastercinderellagirls | 2023-09-17T17:40:17.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kusakabe_wakaba (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kusakabe_wakaba (THE iDOLM@STER: Cinderella Girls), containing 97 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 97 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 261 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 97 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 97 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 97 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 97 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 97 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 261 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 261 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 261 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
deepghs/anime_regular_dataset | 2023-09-15T17:57:42.000Z | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | deepghs | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
A dataset for regularization during training, created using NAI and [7eu7d7/HCP-Diffusion-datas](https://huggingface.co/datasets/7eu7d7/HCP-Diffusion-datas).
The dataset consists of 2000 images, each 512x512 pixels. |
CyberHarem/ujiie_mutsumi_idolmastercinderellagirls | 2023-09-17T17:40:19.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ujiie_mutsumi (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ujiie_mutsumi (THE iDOLM@STER: Cinderella Girls), containing 11 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 11 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 31 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 11 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 11 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 11 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 11 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 11 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 31 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 31 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 31 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/narumiya_yume_idolmastercinderellagirls | 2023-09-17T17:40:21.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of narumiya_yume (THE iDOLM@STER: Cinderella Girls)
This is the dataset of narumiya_yume (THE iDOLM@STER: Cinderella Girls), containing 90 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 90 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 237 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 90 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 90 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 90 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 90 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 90 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 237 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 237 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 237 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/sawada_marina_idolmastercinderellagirls | 2023-09-17T17:40:23.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sawada_marina (THE iDOLM@STER: Cinderella Girls)
This is the dataset of sawada_marina (THE iDOLM@STER: Cinderella Girls), containing 21 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 21 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 57 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 21 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 21 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 21 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 21 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 21 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 57 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 57 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 57 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/oohara_michiru_idolmastercinderellagirls | 2023-09-17T17:40:25.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of oohara_michiru (THE iDOLM@STER: Cinderella Girls)
This is the dataset of oohara_michiru (THE iDOLM@STER: Cinderella Girls), containing 34 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 34 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 86 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 34 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 34 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 34 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 34 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 34 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 86 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 86 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 86 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/sakakibara_satomi_idolmastercinderellagirls | 2023-09-17T17:40:27.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sakakibara_satomi (THE iDOLM@STER: Cinderella Girls)
This is the dataset of sakakibara_satomi (THE iDOLM@STER: Cinderella Girls), containing 62 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 62 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 166 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 62 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 62 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 62 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 62 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 62 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 166 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 166 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 166 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
LDD5522/Rock_Vocals | 2023-09-15T18:21:08.000Z | [
"region:us"
] | LDD5522 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/kiba_manami_idolmastercinderellagirls | 2023-09-17T17:40:29.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kiba_manami (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kiba_manami (THE iDOLM@STER: Cinderella Girls), containing 56 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 56 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 139 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 56 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 56 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 56 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 56 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 56 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 139 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 139 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 139 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/nonomura_sora_idolmastercinderellagirls | 2023-09-17T17:40:31.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nonomura_sora (THE iDOLM@STER: Cinderella Girls)
This is the dataset of nonomura_sora (THE iDOLM@STER: Cinderella Girls), containing 47 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 47 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 123 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 47 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 47 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 47 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 47 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 47 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 123 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 123 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 123 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
bongo2112/diamondplatnumz-SDxl-openpose-output-images | 2023-09-16T07:48:10.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/munakata_atsumi_idolmastercinderellagirls | 2023-09-17T17:40:33.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of munakata_atsumi (THE iDOLM@STER: Cinderella Girls)
This is the dataset of munakata_atsumi (THE iDOLM@STER: Cinderella Girls), containing 97 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 97 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 263 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 97 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 97 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 97 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 97 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 97 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 263 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 263 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 263 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/helen_idolmastercinderellagirls | 2023-09-17T17:40:35.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of helen (THE iDOLM@STER: Cinderella Girls)
This is the dataset of helen (THE iDOLM@STER: Cinderella Girls), containing 27 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 27 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 75 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 27 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 27 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 27 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 27 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 27 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 75 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 75 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 75 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|