| id (string, 2–115 chars) | lastModified (string, 24 chars) | tags (list) | author (string, 2–42 chars, nullable) | description (string, 0–68.7k chars, nullable) | citation (string, 0–10.7k chars, nullable) | cardData (null) | likes (int64, 0–3.55k) | downloads (int64, 0–10.1M) | card (string, 0–1.01M chars) |
|---|---|---|---|---|---|---|---|---|---|
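Each record below is one pipe-separated row; multi-line cells (full dataset cards) spill across lines, but simple rows can be split directly. A minimal stdlib sketch, assuming a single-line row with no embedded pipes and using the column list from the header above:

```python
# Split one single-line record from the pipe-separated listing into
# named fields. Multi-line cells (e.g. full dataset cards) would need
# a real parser; this handles only simple one-line rows.
COLUMNS = ["id", "lastModified", "tags", "author", "description",
           "citation", "cardData", "likes", "downloads", "card"]

def parse_row(line: str) -> dict:
    # Strip surrounding whitespace from each cell and pair with column names.
    cells = [c.strip() for c in line.split("|")]
    return dict(zip(COLUMNS, cells))

row = parse_row(
    'jyothir/embedding-pythia | 2023-09-16T06:19:49.000Z | ["region:us"] '
    '| jyothir | null | null | null | 0 | 0 | Entry not found'
)
print(row["id"], row["downloads"])  # → jyothir/embedding-pythia 0
```

Note that numeric columns (`likes`, `downloads`) come back as strings here; a fuller loader would coerce them per the dtypes in the header.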
jyothir/embedding-pythia | 2023-09-16T06:19:49.000Z | [
"region:us"
] | jyothir | null | null | null | 0 | 0 | Entry not found |
Ammar-Azman/shinjiru-blog | 2023-09-16T06:43:11.000Z | [
"license:mit",
"region:us"
] | Ammar-Azman | null | null | null | 0 | 0 | ---
license: mit
---
|
CyberHarem/shutaura_sequenzia_toarumajutsunoindex | 2023-09-17T17:42:16.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Shutaura Sequenzia
This is the dataset of Shutaura Sequenzia, containing 72 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 72 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 143 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 72 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 72 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 72 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 72 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 72 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 143 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 143 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 143 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/kitazawa_shiho_theidolmstermillionlive | 2023-09-17T17:42:18.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kitazawa_shiho (THE iDOLM@STER: Million Live!)
This is the dataset of kitazawa_shiho (THE iDOLM@STER: Million Live!), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 512 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 512 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 512 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 512 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
ChristophSchuhmann/aesthetics-v2-balanced | 2023-09-16T07:21:33.000Z | [
"region:us"
] | ChristophSchuhmann | null | null | null | 0 | 0 | Entry not found |
CyberHarem/vento_of_the_front_toarumajutsunoindex | 2023-09-17T17:42:20.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Vento of the Front
This is the dataset of Vento of the Front, containing 89 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 89 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 198 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 89 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 89 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 89 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 89 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 89 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 198 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 198 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 198 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/toyokawa_fuuka_theidolmstermillionlive | 2023-09-17T17:42:22.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of toyokawa_fuuka (THE iDOLM@STER: Million Live!)
This is the dataset of toyokawa_fuuka (THE iDOLM@STER: Million Live!), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 544 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 544 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 544 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 544 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/oriana_thomson_toarumajutsunoindex | 2023-09-17T17:42:24.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Oriana Thomson
This is the dataset of Oriana Thomson, containing 98 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 98 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 234 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 98 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 98 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 98 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 98 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 98 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 234 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 234 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 234 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/laura_stuart_toarumajutsunoindex | 2023-09-17T17:42:26.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Laura Stuart
This is the dataset of Laura Stuart, containing 78 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 78 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 160 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 78 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 78 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 78 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 78 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 78 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 160 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 160 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 160 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
linhqyy/data_aug | 2023-09-16T07:50:03.000Z | [
"region:us"
] | linhqyy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: sentence_annotation
dtype: string
- name: intent
dtype: string
- name: entities
list:
- name: type
dtype: string
- name: filler
dtype: string
splits:
- name: train
num_bytes: 330965
num_examples: 1273
download_size: 95261
dataset_size: 330965
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data_aug"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jordane95/berri | 2023-09-16T07:54:58.000Z | [
"region:us"
] | jordane95 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/meigo_arisa_toarumajutsunoindex | 2023-09-17T17:42:28.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Meigo Arisa
This is the dataset of Meigo Arisa, containing 73 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 73 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 176 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 73 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 73 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 73 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 73 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 73 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 176 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 176 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 176 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/carissa_toarumajutsunoindex | 2023-09-17T17:42:30.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Carissa
This is the dataset of Carissa, containing 129 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 129 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 284 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 129 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 129 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 129 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 129 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 129 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 284 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 284 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 284 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
GritTin/VideoComfyUIStarter | 2023-09-21T19:42:36.000Z | [
"license:other",
"region:us"
] | GritTin | null | null | null | 0 | 0 | ---
license: other
---
|
amitness/logits-mt-it-ar-en-512 | 2023-09-16T14:37:05.000Z | [
"region:us"
] | amitness | null | null | null | 0 | 0 | Entry not found |
CyberHarem/sakuramori_kaori_theidolmstermillionlive | 2023-09-17T17:42:32.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sakuramori_kaori (THE iDOLM@STER: Million Live!)
This is the dataset of sakuramori_kaori (THE iDOLM@STER: Million Live!), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 523 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 523 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 523 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 523 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
him009/test_dataset | 2023-09-16T08:36:14.000Z | [
"region:us"
] | him009 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 11384
num_examples: 6
download_size: 21140
dataset_size: 11384
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rteqix/av4 | 2023-09-16T08:45:07.000Z | [
"region:us"
] | Rteqix | null | null | null | 0 | 0 | Entry not found |
kye/all-kye-python-code-2 | 2023-09-16T09:26:27.000Z | [
"license:mit",
"region:us"
] | kye | null | null | null | 1 | 0 | ---
license: mit
---
|
Emma92/emails | 2023-09-16T09:21:09.000Z | [
"region:us"
] | Emma92 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/hakozaki_serika_theidolmstermillionlive | 2023-09-17T17:42:35.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hakozaki_serika (THE iDOLM@STER: Million Live!)
This is the dataset of hakozaki_serika (THE iDOLM@STER: Million Live!), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 544 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 544 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 544 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 544 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Andyrasika/summary_qa | 2023-09-16T09:48:40.000Z | [
"region:us"
] | Andyrasika | null | null | null | 1 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: prompt
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 294050.25
num_examples: 420
- name: test
num_bytes: 98016.75
num_examples: 140
download_size: 211064
dataset_size: 392067.0
---
# Dataset Card for "summary_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/tanaka_kotoha_theidolmstermillionlive | 2023-09-17T17:42:37.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tanaka_kotoha (THE iDOLM@STER: Million Live!)
This is the dataset of tanaka_kotoha (THE iDOLM@STER: Million Live!), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 529 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 529 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 529 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 529 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_ethzanalytics__pythia-31m | 2023-09-16T10:01:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ethzanalytics/pythia-31m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ethzanalytics/pythia-31m](https://huggingface.co/ethzanalytics/pythia-31m) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ethzanalytics__pythia-31m\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-16T09:59:55.108137](https://huggingface.co/datasets/open-llm-leaderboard/details_ethzanalytics__pythia-31m/blob/main/results_2023-09-16T09-59-55.108137.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24200751977402368,\n\
\ \"acc_stderr\": 0.031248763063855857,\n \"acc_norm\": 0.24229172842314425,\n\
\ \"acc_norm_stderr\": 0.03125353644698409,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.5012227342274072,\n\
\ \"mc2_stderr\": 0.01637400748739576\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.1885665529010239,\n \"acc_stderr\": 0.011430897647675785,\n\
\ \"acc_norm\": 0.19965870307167236,\n \"acc_norm_stderr\": 0.01168162575688866\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2577175861382195,\n\
\ \"acc_stderr\": 0.0043648380003356215,\n \"acc_norm\": 0.2633937462656841,\n\
\ \"acc_norm_stderr\": 0.00439573949568858\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n\
\ \"acc_stderr\": 0.03885004245800254,\n \"acc_norm\": 0.2814814814814815,\n\
\ \"acc_norm_stderr\": 0.03885004245800254\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108625,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108625\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21957671957671956,\n \"acc_stderr\": 0.02132001859977036,\n \"\
acc_norm\": 0.21957671957671956,\n \"acc_norm_stderr\": 0.02132001859977036\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392873,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392873\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18064516129032257,\n\
\ \"acc_stderr\": 0.02188617856717254,\n \"acc_norm\": 0.18064516129032257,\n\
\ \"acc_norm_stderr\": 0.02188617856717254\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25757575757575757,\n \"acc_stderr\": 0.031156269519646836,\n \"\
acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.031156269519646836\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565318,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565318\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2282051282051282,\n \"acc_stderr\": 0.02127839386358628,\n \
\ \"acc_norm\": 0.2282051282051282,\n \"acc_norm_stderr\": 0.02127839386358628\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.028205545033277726,\n\
\ \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.028205545033277726\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1944954128440367,\n \"acc_stderr\": 0.016970289090458043,\n \"\
acc_norm\": 0.1944954128440367,\n \"acc_norm_stderr\": 0.016970289090458043\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.030998666304560534,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.030998666304560534\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.03058759135160425,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.30493273542600896,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516304,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516304\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822586,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822586\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.029614323690456645,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.029614323690456645\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23627075351213284,\n\
\ \"acc_stderr\": 0.015190473717037497,\n \"acc_norm\": 0.23627075351213284,\n\
\ \"acc_norm_stderr\": 0.015190473717037497\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071134,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071134\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.014219570788103987,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.014219570788103987\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21241830065359477,\n \"acc_stderr\": 0.02342037547829613,\n\
\ \"acc_norm\": 0.21241830065359477,\n \"acc_norm_stderr\": 0.02342037547829613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19292604501607716,\n\
\ \"acc_stderr\": 0.022411516780911363,\n \"acc_norm\": 0.19292604501607716,\n\
\ \"acc_norm_stderr\": 0.022411516780911363\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.0230167056402622,\n\
\ \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.0230167056402622\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537755,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537755\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n\
\ \"acc_stderr\": 0.011025499291443738,\n \"acc_norm\": 0.24771838331160365,\n\
\ \"acc_norm_stderr\": 0.011025499291443738\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23897058823529413,\n \"acc_stderr\": 0.02590528064489301,\n\
\ \"acc_norm\": 0.23897058823529413,\n \"acc_norm_stderr\": 0.02590528064489301\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23529411764705882,\n \"acc_stderr\": 0.017160587235046345,\n \
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.017160587235046345\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.027529637440174917,\n\
\ \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.027529637440174917\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.03484331592680588,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.03484331592680588\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.034010526201040905,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.034010526201040905\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.5012227342274072,\n\
\ \"mc2_stderr\": 0.01637400748739576\n }\n}\n```"
repo_url: https://huggingface.co/ethzanalytics/pythia-31m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|arc:challenge|25_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hellaswag|10_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T09-59-55.108137.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T09-59-55.108137.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T09-59-55.108137.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T09-59-55.108137.parquet'
- config_name: results
data_files:
- split: 2023_09_16T09_59_55.108137
path:
- results_2023-09-16T09-59-55.108137.parquet
- split: latest
path:
- results_2023-09-16T09-59-55.108137.parquet
---
# Dataset Card for Evaluation run of ethzanalytics/pythia-31m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ethzanalytics/pythia-31m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ethzanalytics/pythia-31m](https://huggingface.co/ethzanalytics/pythia-31m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ethzanalytics__pythia-31m",
"harness_truthfulqa_mc_0",
	split="latest")
```
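The config names in the YAML header above are derived mechanically from the harness task strings, so the name to pass as the second argument can be computed from a task string like those in the results JSON. A small sketch (a convenience helper, not part of the official `datasets` API):

```python
# Map a harness task string (as it appears in the results JSON) to the
# dataset config name used as load_dataset's second argument.
def config_name(task: str) -> str:
    # "harness|hendrycksTest-virology|5" -> "harness_hendrycksTest_virology_5"
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

print(config_name("harness|hendrycksTest-virology|5"))
print(config_name("harness|truthfulqa:mc|0"))
```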
## Latest results
These are the [latest results from run 2023-09-16T09:59:55.108137](https://huggingface.co/datasets/open-llm-leaderboard/details_ethzanalytics__pythia-31m/blob/main/results_2023-09-16T09-59-55.108137.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24200751977402368,
"acc_stderr": 0.031248763063855857,
"acc_norm": 0.24229172842314425,
"acc_norm_stderr": 0.03125353644698409,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.5012227342274072,
"mc2_stderr": 0.01637400748739576
},
"harness|arc:challenge|25": {
"acc": 0.1885665529010239,
"acc_stderr": 0.011430897647675785,
"acc_norm": 0.19965870307167236,
"acc_norm_stderr": 0.01168162575688866
},
"harness|hellaswag|10": {
"acc": 0.2577175861382195,
"acc_stderr": 0.0043648380003356215,
"acc_norm": 0.2633937462656841,
"acc_norm_stderr": 0.00439573949568858
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.03885004245800254,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.03885004245800254
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108625,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108625
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21957671957671956,
"acc_stderr": 0.02132001859977036,
"acc_norm": 0.21957671957671956,
"acc_norm_stderr": 0.02132001859977036
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392873,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392873
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18064516129032257,
"acc_stderr": 0.02188617856717254,
"acc_norm": 0.18064516129032257,
"acc_norm_stderr": 0.02188617856717254
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.030108330718011625,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.030108330718011625
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25757575757575757,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.25757575757575757,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565318,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2282051282051282,
"acc_stderr": 0.02127839386358628,
"acc_norm": 0.2282051282051282,
"acc_norm_stderr": 0.02127839386358628
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.028205545033277726,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.028205545033277726
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1944954128440367,
"acc_stderr": 0.016970289090458043,
"acc_norm": 0.1944954128440367,
"acc_norm_stderr": 0.016970289090458043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.030998666304560534,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.030998666304560534
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.30493273542600896,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.30493273542600896,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516304,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516304
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822586,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822586
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456645,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456645
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23627075351213284,
"acc_stderr": 0.015190473717037497,
"acc_norm": 0.23627075351213284,
"acc_norm_stderr": 0.015190473717037497
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071134,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071134
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.014219570788103987,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.014219570788103987
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21241830065359477,
"acc_stderr": 0.02342037547829613,
"acc_norm": 0.21241830065359477,
"acc_norm_stderr": 0.02342037547829613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19292604501607716,
"acc_stderr": 0.022411516780911363,
"acc_norm": 0.19292604501607716,
"acc_norm_stderr": 0.022411516780911363
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.0230167056402622,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.0230167056402622
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537755,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537755
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.011025499291443738,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.011025499291443738
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23897058823529413,
"acc_stderr": 0.02590528064489301,
"acc_norm": 0.23897058823529413,
"acc_norm_stderr": 0.02590528064489301
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.017160587235046345,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.017160587235046345
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.027529637440174917,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.027529637440174917
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.03484331592680588,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.03484331592680588
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.034010526201040905,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.034010526201040905
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.5012227342274072,
"mc2_stderr": 0.01637400748739576
}
}
```
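The nested JSON above can be flattened into per-task rows for quick comparison. A minimal stdlib sketch over an excerpt of the results (the full file follows the same shape):

```python
# Flatten per-task accuracies from the results JSON and sort descending.
results = {
    "all": {"acc": 0.24200751977402368},
    "harness|arc:challenge|25": {"acc": 0.1885665529010239},
    "harness|hellaswag|10": {"acc": 0.2577175861382195},
}

rows = sorted(
    ((task, scores["acc"]) for task, scores in results.items() if task != "all"),
    key=lambda row: row[1],
    reverse=True,
)
for task, acc in rows:
    print(f"{task}: {acc:.4f}")
```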
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_player1537__dolphinette | 2023-09-16T10:15:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of player1537/dolphinette
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [player1537/dolphinette](https://huggingface.co/player1537/dolphinette) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_player1537__dolphinette\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-16T10:13:56.989865](https://huggingface.co/datasets/open-llm-leaderboard/details_player1537__dolphinette/blob/main/results_2023-09-16T10-13-56.989865.json) (note\
\ that there might be results for other tasks in the repository if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.254255565472104,\n\
\ \"acc_stderr\": 0.031540212755071906,\n \"acc_norm\": 0.25569367419777816,\n\
\ \"acc_norm_stderr\": 0.0315513909799735,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.4207661403631234,\n\
\ \"mc2_stderr\": 0.015180806324975694\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2235494880546075,\n \"acc_stderr\": 0.012174896631202607,\n\
\ \"acc_norm\": 0.24914675767918087,\n \"acc_norm_stderr\": 0.012639407111926439\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3140808603863772,\n\
\ \"acc_stderr\": 0.004632001732332983,\n \"acc_norm\": 0.37333200557657836,\n\
\ \"acc_norm_stderr\": 0.0048270065208028835\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174023,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174023\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.035541803680256896,\n\
\ \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.035541803680256896\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.026749899771241238,\n\
\ \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.026749899771241238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838896,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838896\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.02802022627120022,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.02802022627120022\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790604,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790604\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036847,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3096774193548387,\n\
\ \"acc_stderr\": 0.026302774983517414,\n \"acc_norm\": 0.3096774193548387,\n\
\ \"acc_norm_stderr\": 0.026302774983517414\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.33678756476683935,\n \"acc_stderr\": 0.03410780251836183,\n\
\ \"acc_norm\": 0.33678756476683935,\n \"acc_norm_stderr\": 0.03410780251836183\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.25384615384615383,\n \"acc_stderr\": 0.022066054378726257,\n\
\ \"acc_norm\": 0.25384615384615383,\n \"acc_norm_stderr\": 0.022066054378726257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \
\ \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25504587155963304,\n\
\ \"acc_stderr\": 0.018688500856535832,\n \"acc_norm\": 0.25504587155963304,\n\
\ \"acc_norm_stderr\": 0.018688500856535832\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n\
\ \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.03096451792692341,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.03096451792692341\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25316455696202533,\n \"acc_stderr\": 0.02830465794303531,\n \
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.02830465794303531\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17937219730941703,\n\
\ \"acc_stderr\": 0.025749819569192794,\n \"acc_norm\": 0.17937219730941703,\n\
\ \"acc_norm_stderr\": 0.025749819569192794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.17857142857142858,\n\
\ \"acc_stderr\": 0.03635209121577806,\n \"acc_norm\": 0.17857142857142858,\n\
\ \"acc_norm_stderr\": 0.03635209121577806\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n\
\ \"acc_stderr\": 0.025598193686652265,\n \"acc_norm\": 0.18803418803418803,\n\
\ \"acc_norm_stderr\": 0.025598193686652265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24776500638569604,\n\
\ \"acc_stderr\": 0.015438083080568972,\n \"acc_norm\": 0.24776500638569604,\n\
\ \"acc_norm_stderr\": 0.015438083080568972\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.02228963885261791,\n\
\ \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.02228963885261791\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369922,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369922\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02564686309713791,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02564686309713791\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.02407180588767705,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.02407180588767705\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713002,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713002\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432414,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432414\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.01089612365267665,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.01089612365267665\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23897058823529413,\n \"acc_stderr\": 0.02590528064489301,\n\
\ \"acc_norm\": 0.23897058823529413,\n \"acc_norm_stderr\": 0.02590528064489301\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.22712418300653595,\n \"acc_stderr\": 0.016949853279212376,\n \
\ \"acc_norm\": 0.22712418300653595,\n \"acc_norm_stderr\": 0.016949853279212376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724137,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724137\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.02737294220178817,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.02737294220178817\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.18072289156626506,\n\
\ \"acc_stderr\": 0.029955737855810138,\n \"acc_norm\": 0.18072289156626506,\n\
\ \"acc_norm_stderr\": 0.029955737855810138\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.03246721765117825,\n\
\ \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.03246721765117825\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.4207661403631234,\n\
\ \"mc2_stderr\": 0.015180806324975694\n }\n}\n```"
repo_url: https://huggingface.co/player1537/dolphinette
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|arc:challenge|25_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hellaswag|10_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T10-13-56.989865.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T10-13-56.989865.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T10-13-56.989865.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T10-13-56.989865.parquet'
- config_name: results
data_files:
- split: 2023_09_16T10_13_56.989865
path:
- results_2023-09-16T10-13-56.989865.parquet
- split: latest
path:
- results_2023-09-16T10-13-56.989865.parquet
---
# Dataset Card for Evaluation run of player1537/dolphinette
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/player1537/dolphinette
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [player1537/dolphinette](https://huggingface.co/player1537/dolphinette) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_player1537__dolphinette",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T10:13:56.989865](https://huggingface.co/datasets/open-llm-leaderboard/details_player1537__dolphinette/blob/main/results_2023-09-16T10-13-56.989865.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.254255565472104,
"acc_stderr": 0.031540212755071906,
"acc_norm": 0.25569367419777816,
"acc_norm_stderr": 0.0315513909799735,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.01539211880501503,
"mc2": 0.4207661403631234,
"mc2_stderr": 0.015180806324975694
},
"harness|arc:challenge|25": {
"acc": 0.2235494880546075,
"acc_stderr": 0.012174896631202607,
"acc_norm": 0.24914675767918087,
"acc_norm_stderr": 0.012639407111926439
},
"harness|hellaswag|10": {
"acc": 0.3140808603863772,
"acc_stderr": 0.004632001732332983,
"acc_norm": 0.37333200557657836,
"acc_norm_stderr": 0.0048270065208028835
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174023,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174023
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.035541803680256896,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.035541803680256896
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2528301886792453,
"acc_stderr": 0.026749899771241238,
"acc_norm": 0.2528301886792453,
"acc_norm_stderr": 0.026749899771241238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838896,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838896
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.02802022627120022,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.02802022627120022
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790604,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790604
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3096774193548387,
"acc_stderr": 0.026302774983517414,
"acc_norm": 0.3096774193548387,
"acc_norm_stderr": 0.026302774983517414
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.33678756476683935,
"acc_stderr": 0.03410780251836183,
"acc_norm": 0.33678756476683935,
"acc_norm_stderr": 0.03410780251836183
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.25384615384615383,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.25384615384615383,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25504587155963304,
"acc_stderr": 0.018688500856535832,
"acc_norm": 0.25504587155963304,
"acc_norm_stderr": 0.018688500856535832
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.03096451792692341,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.03096451792692341
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.02830465794303531,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.02830465794303531
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.17937219730941703,
"acc_stderr": 0.025749819569192794,
"acc_norm": 0.17937219730941703,
"acc_norm_stderr": 0.025749819569192794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.17857142857142858,
"acc_stderr": 0.03635209121577806,
"acc_norm": 0.17857142857142858,
"acc_norm_stderr": 0.03635209121577806
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.025598193686652265,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.025598193686652265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24776500638569604,
"acc_stderr": 0.015438083080568972,
"acc_norm": 0.24776500638569604,
"acc_norm_stderr": 0.015438083080568972
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.02228963885261791,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.02228963885261791
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369922,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369922
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02564686309713791,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02564686309713791
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.02407180588767705,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.02407180588767705
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432414,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432414
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.01089612365267665,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.01089612365267665
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23897058823529413,
"acc_stderr": 0.02590528064489301,
"acc_norm": 0.23897058823529413,
"acc_norm_stderr": 0.02590528064489301
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22712418300653595,
"acc_stderr": 0.016949853279212376,
"acc_norm": 0.22712418300653595,
"acc_norm_stderr": 0.016949853279212376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724137,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724137
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.02737294220178817,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.02737294220178817
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.18072289156626506,
"acc_stderr": 0.029955737855810138,
"acc_norm": 0.18072289156626506,
"acc_norm_stderr": 0.029955737855810138
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.03246721765117825,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.03246721765117825
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.01539211880501503,
"mc2": 0.4207661403631234,
"mc2_stderr": 0.015180806324975694
}
}
```
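To illustrate working with a results dictionary of this shape, the sketch below averages `acc_norm` over the `hendrycksTest` (MMLU) entries. The small dictionary is a hypothetical excerpt, not the full results:

```python
# Average acc_norm across the hendrycksTest (MMLU) tasks in a results dict
# shaped like the JSON above. The sample dict is a hypothetical excerpt.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.2518518518518518},
    "harness|truthfulqa:mc|0": {"mc1": 0.26193390452876375},
}

# Keep only the hendrycksTest entries, identified by their key prefix.
mmlu = [v["acc_norm"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(round(mmlu_avg, 4))  # → 0.2559
```

The same prefix-filtering approach works for any of the `harness|…` task families stored in these result files.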
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Beetho/Trucks-Detection-Yolov8 | 2023-09-16T11:19:48.000Z | [
"task_categories:object-detection",
"size_categories:n<1K",
"language:en",
"language:fr",
"language:de",
"language:it",
"language:es",
"language:ca",
"license:cc-by-3.0",
"region:us"
] | Beetho | null | null | null | 0 | 0 | ---
license: cc-by-3.0
task_categories:
- object-detection
language:
- en
- fr
- de
- it
- es
- ca
size_categories:
- n<1K
---
Trucks Detection - v1
==============================
This dataset was exported via roboflow.com on September 11, 2023 at 8:38 AM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand and search unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
The dataset includes 746 images.
Trucks are annotated in YOLOv8 format.
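For reference, a YOLO-format label file stores one object per line as `class x_center y_center width height`, with coordinates normalized to [0, 1]. A minimal parser sketch (the sample line below is hypothetical):

```python
def parse_yolo_line(line: str):
    """Parse one YOLO-format annotation line:
    'class x_center y_center width height', coordinates normalized to [0, 1]."""
    cls, *coords = line.split()
    return int(cls), [float(c) for c in coords]

# Hypothetical label line for a single truck bounding box.
cls_id, box = parse_yolo_line("0 0.5123 0.4871 0.2500 0.1250")
print(cls_id, box)
```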
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 640x640 (Stretch)
* Grayscale (CRT phosphor)
The following augmentation was applied to create 3 versions of each source image:
* Random Gaussian blur of between 0 and 1.5 pixels
* Salt and pepper noise was applied to 5 percent of pixels |
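The salt-and-pepper augmentation above can be sketched in plain NumPy. This is an illustrative approximation under stated assumptions (uniform pixel selection, an even salt/pepper split), not Roboflow's implementation:

```python
import numpy as np

def salt_and_pepper(img: np.ndarray, fraction: float = 0.05, seed: int = 0) -> np.ndarray:
    """Set a random `fraction` of pixels to pure black or white."""
    rng = np.random.default_rng(seed)
    out = img.copy()
    mask = rng.random(img.shape[:2]) < fraction  # which pixels get noise
    salt = rng.random(img.shape[:2]) < 0.5       # half salt, half pepper
    out[mask & salt] = 255
    out[mask & ~salt] = 0
    return out

# Stand-in for a 640x640 grayscale patch from this dataset.
gray = np.full((640, 640), 128, dtype=np.uint8)
noisy = salt_and_pepper(gray)
print(noisy.shape, noisy.min(), noisy.max())
```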
khizarhussainn/your_dataset_name | 2023-09-16T10:31:55.000Z | [
"region:us"
] | khizarhussainn | null | null | null | 0 | 0 | Entry not found |
piyush23111991/clinicalTrial | 2023-09-16T11:41:41.000Z | [
"region:us"
] | piyush23111991 | null | null | null | 0 | 0 | Entry not found |
quocanh34/NLU_aug | 2023-09-16T10:37:43.000Z | [
"region:us"
] | quocanh34 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: intent
dtype: string
- name: entities
list:
- name: type
dtype: string
- name: filler
dtype: string
- name: file
dtype: string
splits:
- name: train
num_bytes: 163690
num_examples: 1299
download_size: 51331
dataset_size: 163690
---
# Dataset Card for "NLU_aug"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jerry-Master/lung-tumour-study | 2023-09-16T12:04:37.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | Jerry-Master | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
---
# Combining graph neural networks and computer vision methods for cell nuclei classification in lung tissue
This is the dataset of the article in the title. It contains 85 patches of 1024x1024 pixels from H&E stained WSIs of 9 different patients. It contains two main classes: tumoural (2) and non tumoural (1). Due to the difficulty of the problem, 153 cells were labelled as uncertain. For technical reasons, we decided to eliminate them in the train and validation set and we carefully chose the test set so that it included no uncertain cell. In total there are 21255 cells in the train set, 4114 in the validation set and 5533 in the test set. We manually reviewed that no patient is in two splits at the same time, ensuring that the split has no data leakage in any way.
## Structure
The data is provided in several ways. The `orig` folder contains the images without any annotation. The `overlay` folder contains the same images with the cells overlaid on top for visualization purposes, with healthy cells shown in red and tumoural ones in green. Annotations were made using a software called QuPath; the raw GeoJSON files extracted from the application are in `raw_geojson`. However, bear in mind that they may contain duplicated cells and uncertain cells. We are releasing them together with the scripts in the `scripts` folder so that any interested researcher can load the annotations back into QuPath and review the labels. If you, as an expert, believe we have labelled some cells incorrectly, please feel free to contact us. The remaining folders (`train`, `validation`, `test`) contain the data ready to use, with the same structure as specified in the [tumourkit package documentation](https://lung-tumour-study.readthedocs.io/en/latest/usage.html#make-dirs). Just move them into the data folder; notice you will need to move the `orig` folder too.
Any `pred` or `hov` folder is provided as an example; it contains predictions from one of our models. If you train your own models, you should delete these folders. The `npy` folders contain 518x518 crops of the original images. You can train Hovernet with other shapes if you want by modifying the code provided by the [Tumourkit library](https://github.com/Jerry-Master/lung-tumour-study).
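As an illustration of reviewing the `raw_geojson` annotations programmatically, the sketch below tallies cell classes, assuming the QuPath export stores each label under `properties.classification.name`; the class names and the tiny sample are hypothetical:

```python
def count_classes(geojson: dict) -> dict:
    """Tally cell classes in a QuPath-style GeoJSON export.

    Assumes each feature stores its label under
    properties -> classification -> name (hypothetical names below).
    """
    counts = {}
    for feature in geojson.get("features", []):
        name = feature["properties"]["classification"]["name"]
        counts[name] = counts.get(name, 0) + 1
    return counts

# Tiny hand-made example in the assumed schema.
sample = {
    "type": "FeatureCollection",
    "features": [
        {"properties": {"classification": {"name": "tumoural"}}},
        {"properties": {"classification": {"name": "non-tumoural"}}},
        {"properties": {"classification": {"name": "tumoural"}}},
    ],
}
print(count_classes(sample))
```

A tally like this is also a quick way to spot duplicated or uncertain cells before loading the annotations back into QuPath.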
TinyPixel/airoboros-2.2 | 2023-09-16T10:42:07.000Z | [
"region:us"
] | TinyPixel | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 171010250
num_examples: 88240
download_size: 94788763
dataset_size: 171010250
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "airoboros-2.2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cym31152/corn | 2023-09-16T10:53:35.000Z | [
"region:us"
] | cym31152 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_aqweteddy__Tulpar-tv_marcoroni-7b | 2023-09-16T11:06:55.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of aqweteddy/Tulpar-tv_marcoroni-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aqweteddy/Tulpar-tv_marcoroni-7b](https://huggingface.co/aqweteddy/Tulpar-tv_marcoroni-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aqweteddy__Tulpar-tv_marcoroni-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-16T11:05:38.004815](https://huggingface.co/datasets/open-llm-leaderboard/details_aqweteddy__Tulpar-tv_marcoroni-7b/blob/main/results_2023-09-16T11-05-38.004815.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3312563883942805,\n\
\ \"acc_stderr\": 0.03372195940077684,\n \"acc_norm\": 0.33458244613980964,\n\
\ \"acc_norm_stderr\": 0.0337194423696009,\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386847,\n \"mc2\": 0.4937561621069656,\n\
\ \"mc2_stderr\": 0.016106089320397136\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.38993174061433444,\n \"acc_stderr\": 0.014252959848892877,\n\
\ \"acc_norm\": 0.41638225255972694,\n \"acc_norm_stderr\": 0.01440561827943617\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5012945628360884,\n\
\ \"acc_stderr\": 0.0049897646867388306,\n \"acc_norm\": 0.671081457876917,\n\
\ \"acc_norm_stderr\": 0.004688601416815203\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3925925925925926,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.3925925925925926,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3660377358490566,\n \"acc_stderr\": 0.02964781353936524,\n\
\ \"acc_norm\": 0.3660377358490566,\n \"acc_norm_stderr\": 0.02964781353936524\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.14,\n \"acc_stderr\": 0.03487350880197772,\n \"acc_norm\": 0.14,\n\
\ \"acc_norm_stderr\": 0.03487350880197772\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.03345036916788991,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.03345036916788991\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n\
\ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491842,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491842\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.36451612903225805,\n \"acc_stderr\": 0.02737987122994324,\n \"\
acc_norm\": 0.36451612903225805,\n \"acc_norm_stderr\": 0.02737987122994324\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"\
acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.44242424242424244,\n \"acc_stderr\": 0.03878372113711275,\n\
\ \"acc_norm\": 0.44242424242424244,\n \"acc_norm_stderr\": 0.03878372113711275\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.30303030303030304,\n \"acc_stderr\": 0.03274287914026868,\n \"\
acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.03274287914026868\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3160621761658031,\n \"acc_stderr\": 0.03355397369686174,\n\
\ \"acc_norm\": 0.3160621761658031,\n \"acc_norm_stderr\": 0.03355397369686174\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.021763733684173933,\n\
\ \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.021763733684173933\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.02772206549336127,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.02772206549336127\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804724,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804724\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3504587155963303,\n \"acc_stderr\": 0.020456077599824457,\n \"\
acc_norm\": 0.3504587155963303,\n \"acc_norm_stderr\": 0.020456077599824457\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.19444444444444445,\n \"acc_stderr\": 0.02699145450203673,\n \"\
acc_norm\": 0.19444444444444445,\n \"acc_norm_stderr\": 0.02699145450203673\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.44607843137254904,\n \"acc_stderr\": 0.03488845451304974,\n \"\
acc_norm\": 0.44607843137254904,\n \"acc_norm_stderr\": 0.03488845451304974\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5316455696202531,\n \"acc_stderr\": 0.032481974005110756,\n \
\ \"acc_norm\": 0.5316455696202531,\n \"acc_norm_stderr\": 0.032481974005110756\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.38565022421524664,\n\
\ \"acc_stderr\": 0.03266842214289202,\n \"acc_norm\": 0.38565022421524664,\n\
\ \"acc_norm_stderr\": 0.03266842214289202\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.33587786259541985,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.33587786259541985,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.36363636363636365,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.04689765937278131,\n\
\ \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.04689765937278131\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4658119658119658,\n\
\ \"acc_stderr\": 0.03267942734081228,\n \"acc_norm\": 0.4658119658119658,\n\
\ \"acc_norm_stderr\": 0.03267942734081228\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.47381864623243936,\n\
\ \"acc_stderr\": 0.017855434554041986,\n \"acc_norm\": 0.47381864623243936,\n\
\ \"acc_norm_stderr\": 0.017855434554041986\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3179190751445087,\n \"acc_stderr\": 0.025070713719153183,\n\
\ \"acc_norm\": 0.3179190751445087,\n \"acc_norm_stderr\": 0.025070713719153183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.30718954248366015,\n \"acc_stderr\": 0.026415601914388995,\n\
\ \"acc_norm\": 0.30718954248366015,\n \"acc_norm_stderr\": 0.026415601914388995\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.31189710610932475,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.31189710610932475,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4104938271604938,\n \"acc_stderr\": 0.027371350925124764,\n\
\ \"acc_norm\": 0.4104938271604938,\n \"acc_norm_stderr\": 0.027371350925124764\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290396,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28878748370273793,\n\
\ \"acc_stderr\": 0.011574914757219962,\n \"acc_norm\": 0.28878748370273793,\n\
\ \"acc_norm_stderr\": 0.011574914757219962\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.02841820861940679,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.02841820861940679\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.34477124183006536,\n \"acc_stderr\": 0.01922832201869664,\n \
\ \"acc_norm\": 0.34477124183006536,\n \"acc_norm_stderr\": 0.01922832201869664\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.4090909090909091,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3469387755102041,\n \"acc_stderr\": 0.0304725260267265,\n\
\ \"acc_norm\": 0.3469387755102041,\n \"acc_norm_stderr\": 0.0304725260267265\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.31343283582089554,\n\
\ \"acc_stderr\": 0.032801882053486435,\n \"acc_norm\": 0.31343283582089554,\n\
\ \"acc_norm_stderr\": 0.032801882053486435\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n\
\ \"acc_stderr\": 0.037658451171688624,\n \"acc_norm\": 0.37349397590361444,\n\
\ \"acc_norm_stderr\": 0.037658451171688624\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4678362573099415,\n \"acc_stderr\": 0.03826882417660368,\n\
\ \"acc_norm\": 0.4678362573099415,\n \"acc_norm_stderr\": 0.03826882417660368\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.016095884155386847,\n \"mc2\": 0.4937561621069656,\n\
\ \"mc2_stderr\": 0.016106089320397136\n }\n}\n```"
repo_url: https://huggingface.co/aqweteddy/Tulpar-tv_marcoroni-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|arc:challenge|25_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hellaswag|10_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-05-38.004815.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-05-38.004815.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T11-05-38.004815.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T11-05-38.004815.parquet'
- config_name: results
data_files:
- split: 2023_09_16T11_05_38.004815
path:
- results_2023-09-16T11-05-38.004815.parquet
- split: latest
path:
- results_2023-09-16T11-05-38.004815.parquet
---
# Dataset Card for Evaluation run of aqweteddy/Tulpar-tv_marcoroni-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/aqweteddy/Tulpar-tv_marcoroni-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [aqweteddy/Tulpar-tv_marcoroni-7b](https://huggingface.co/aqweteddy/Tulpar-tv_marcoroni-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aqweteddy__Tulpar-tv_marcoroni-7b",
"harness_truthfulqa_mc_0",
	split="latest")
```
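Each config name follows the pattern `harness_<task>_<num_fewshot>`, with `-` and `:` in the harness task name replaced by `_`. A small illustrative helper (not part of any official tooling) sketches this mapping:

```python
def task_to_config_name(task: str, num_fewshot: int) -> str:
    """Map a harness task name (e.g. 'hendrycksTest-virology' or
    'truthfulqa:mc') to the config name used in this dataset,
    e.g. 'harness_hendrycksTest_virology_5'.
    Illustrative helper, derived from the config list above."""
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{num_fewshot}"


print(task_to_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
print(task_to_config_name("hendrycksTest-virology", 5))
# harness_hendrycksTest_virology_5
```

These names match the `config_name` entries listed in this card's metadata.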
## Latest results
These are the [latest results from run 2023-09-16T11:05:38.004815](https://huggingface.co/datasets/open-llm-leaderboard/details_aqweteddy__Tulpar-tv_marcoroni-7b/blob/main/results_2023-09-16T11-05-38.004815.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task in its timestamped splits and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.3312563883942805,
"acc_stderr": 0.03372195940077684,
"acc_norm": 0.33458244613980964,
"acc_norm_stderr": 0.0337194423696009,
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386847,
"mc2": 0.4937561621069656,
"mc2_stderr": 0.016106089320397136
},
"harness|arc:challenge|25": {
"acc": 0.38993174061433444,
"acc_stderr": 0.014252959848892877,
"acc_norm": 0.41638225255972694,
"acc_norm_stderr": 0.01440561827943617
},
"harness|hellaswag|10": {
"acc": 0.5012945628360884,
"acc_stderr": 0.0049897646867388306,
"acc_norm": 0.671081457876917,
"acc_norm_stderr": 0.004688601416815203
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3660377358490566,
"acc_stderr": 0.02964781353936524,
"acc_norm": 0.3660377358490566,
"acc_norm_stderr": 0.02964781353936524
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.14,
"acc_stderr": 0.03487350880197772,
"acc_norm": 0.14,
"acc_norm_stderr": 0.03487350880197772
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788991,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788991
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.03106898596312215,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.03106898596312215
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491842,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36451612903225805,
"acc_stderr": 0.02737987122994324,
"acc_norm": 0.36451612903225805,
"acc_norm_stderr": 0.02737987122994324
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.44242424242424244,
"acc_stderr": 0.03878372113711275,
"acc_norm": 0.44242424242424244,
"acc_norm_stderr": 0.03878372113711275
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.03274287914026868,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.03274287914026868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3160621761658031,
"acc_stderr": 0.03355397369686174,
"acc_norm": 0.3160621761658031,
"acc_norm_stderr": 0.03355397369686174
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.021763733684173933,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.021763733684173933
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.02772206549336127,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.02772206549336127
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804724,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804724
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3504587155963303,
"acc_stderr": 0.020456077599824457,
"acc_norm": 0.3504587155963303,
"acc_norm_stderr": 0.020456077599824457
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.02699145450203673,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.02699145450203673
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.44607843137254904,
"acc_stderr": 0.03488845451304974,
"acc_norm": 0.44607843137254904,
"acc_norm_stderr": 0.03488845451304974
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5316455696202531,
"acc_stderr": 0.032481974005110756,
"acc_norm": 0.5316455696202531,
"acc_norm_stderr": 0.032481974005110756
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.38565022421524664,
"acc_stderr": 0.03266842214289202,
"acc_norm": 0.38565022421524664,
"acc_norm_stderr": 0.03266842214289202
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.33587786259541985,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.33587786259541985,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.33980582524271846,
"acc_stderr": 0.04689765937278131,
"acc_norm": 0.33980582524271846,
"acc_norm_stderr": 0.04689765937278131
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4658119658119658,
"acc_stderr": 0.03267942734081228,
"acc_norm": 0.4658119658119658,
"acc_norm_stderr": 0.03267942734081228
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.47381864623243936,
"acc_stderr": 0.017855434554041986,
"acc_norm": 0.47381864623243936,
"acc_norm_stderr": 0.017855434554041986
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3179190751445087,
"acc_stderr": 0.025070713719153183,
"acc_norm": 0.3179190751445087,
"acc_norm_stderr": 0.025070713719153183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.30718954248366015,
"acc_stderr": 0.026415601914388995,
"acc_norm": 0.30718954248366015,
"acc_norm_stderr": 0.026415601914388995
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.31189710610932475,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.31189710610932475,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4104938271604938,
"acc_stderr": 0.027371350925124764,
"acc_norm": 0.4104938271604938,
"acc_norm_stderr": 0.027371350925124764
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290396,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28878748370273793,
"acc_stderr": 0.011574914757219962,
"acc_norm": 0.28878748370273793,
"acc_norm_stderr": 0.011574914757219962
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.02841820861940679,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.02841820861940679
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.34477124183006536,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.34477124183006536,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3469387755102041,
"acc_stderr": 0.0304725260267265,
"acc_norm": 0.3469387755102041,
"acc_norm_stderr": 0.0304725260267265
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.31343283582089554,
"acc_stderr": 0.032801882053486435,
"acc_norm": 0.31343283582089554,
"acc_norm_stderr": 0.032801882053486435
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.037658451171688624,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.037658451171688624
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4678362573099415,
"acc_stderr": 0.03826882417660368,
"acc_norm": 0.4678362573099415,
"acc_norm_stderr": 0.03826882417660368
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386847,
"mc2": 0.4937561621069656,
"mc2_stderr": 0.016106089320397136
}
}
```
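The per-run results have the nested shape shown above: a top-level `"all"` aggregate plus one entry per harness task. A minimal sketch of picking the best-scoring task from such a dict (using a hand-copied subset of the metrics above, not a live download):

```python
# A small subset of the results JSON above, copied by hand for illustration.
results = {
    "all": {"acc": 0.3312563883942805, "acc_norm": 0.33458244613980964},
    "harness|arc:challenge|25": {"acc": 0.38993174061433444, "acc_norm": 0.41638225255972694},
    "harness|hellaswag|10": {"acc": 0.5012945628360884, "acc_norm": 0.671081457876917},
}

# Collect per-task normalized accuracy, skipping the "all" aggregate.
task_accs = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task != "all"
}

# The task with the highest acc_norm in this subset.
best_task = max(task_accs, key=task_accs.get)
```

Here `best_task` is `"harness|hellaswag|10"`, the highest `acc_norm` among the copied entries.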
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/tokugawa_matsuri_theidolmstermillionlive | 2023-09-17T17:42:39.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Tokugawa Matsuri (THE iDOLM@STER: Million Live!)
This is the dataset of Tokugawa Matsuri (THE iDOLM@STER: Million Live!), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 524 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 524 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 524 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 524 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V2.0 | 2023-09-16T11:14:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0](https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V2.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-16T11:13:20.345757](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V2.0/blob/main/results_2023-09-16T11-13-20.345757.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4948662611548734,\n\
\ \"acc_stderr\": 0.03510121450482972,\n \"acc_norm\": 0.4987437024013202,\n\
\ \"acc_norm_stderr\": 0.03509047386923906,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.44695776879520727,\n\
\ \"mc2_stderr\": 0.014691595442781428\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4726962457337884,\n \"acc_stderr\": 0.014589589101985996,\n\
\ \"acc_norm\": 0.507679180887372,\n \"acc_norm_stderr\": 0.01460966744089257\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5598486357299343,\n\
\ \"acc_stderr\": 0.004953907062096598,\n \"acc_norm\": 0.7536347341167098,\n\
\ \"acc_norm_stderr\": 0.004300131223340694\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.28835978835978837,\n \"acc_stderr\": 0.023330654054535892,\n \"\
acc_norm\": 0.28835978835978837,\n \"acc_norm_stderr\": 0.023330654054535892\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5645161290322581,\n\
\ \"acc_stderr\": 0.028206225591502734,\n \"acc_norm\": 0.5645161290322581,\n\
\ \"acc_norm_stderr\": 0.028206225591502734\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.032957975663112704,\n\
\ \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.032957975663112704\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999934,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999934\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.032577140777096614,\n\
\ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.032577140777096614\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764194,\n\
\ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764194\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6788990825688074,\n \"acc_stderr\": 0.02001814977273375,\n \"\
acc_norm\": 0.6788990825688074,\n \"acc_norm_stderr\": 0.02001814977273375\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6323529411764706,\n \"acc_stderr\": 0.03384132045674118,\n \"\
acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.03384132045674118\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6708860759493671,\n \"acc_stderr\": 0.030587326294702368,\n \
\ \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.030587326294702368\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.0418644516301375,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.0418644516301375\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5460122699386503,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.5460122699386503,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n\
\ \"acc_stderr\": 0.028286324075564393,\n \"acc_norm\": 0.7521367521367521,\n\
\ \"acc_norm_stderr\": 0.028286324075564393\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6896551724137931,\n\
\ \"acc_stderr\": 0.016543785026048308,\n \"acc_norm\": 0.6896551724137931,\n\
\ \"acc_norm_stderr\": 0.016543785026048308\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5057803468208093,\n \"acc_stderr\": 0.026917296179149123,\n\
\ \"acc_norm\": 0.5057803468208093,\n \"acc_norm_stderr\": 0.026917296179149123\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.028452639985088006,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.028452639985088006\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5627009646302251,\n\
\ \"acc_stderr\": 0.028173917761762902,\n \"acc_norm\": 0.5627009646302251,\n\
\ \"acc_norm_stderr\": 0.028173917761762902\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5679012345679012,\n \"acc_stderr\": 0.02756301097160668,\n\
\ \"acc_norm\": 0.5679012345679012,\n \"acc_norm_stderr\": 0.02756301097160668\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.02860208586275942,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.02860208586275942\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34419817470664926,\n\
\ \"acc_stderr\": 0.012134433741002574,\n \"acc_norm\": 0.34419817470664926,\n\
\ \"acc_norm_stderr\": 0.012134433741002574\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46568627450980393,\n \"acc_stderr\": 0.020180144843307296,\n \
\ \"acc_norm\": 0.46568627450980393,\n \"acc_norm_stderr\": 0.020180144843307296\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893783,\n\
\ \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.032510068164586174,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.032510068164586174\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.44695776879520727,\n\
\ \"mc2_stderr\": 0.014691595442781428\n }\n}\n```"
repo_url: https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|arc:challenge|25_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hellaswag|10_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-13-20.345757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-13-20.345757.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T11-13-20.345757.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T11-13-20.345757.parquet'
- config_name: results
data_files:
- split: 2023_09_16T11_13_20.345757
path:
- results_2023-09-16T11-13-20.345757.parquet
- split: latest
path:
- results_2023-09-16T11-13-20.345757.parquet
---
# Dataset Card for Evaluation run of elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0](https://huggingface.co/elliotthwang/Elliott-Chinese-LLaMa-GPTQ-V2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V2.0",
"harness_truthfulqa_mc_0",
         split="latest")
```
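The per-task MMLU configs listed above all follow the naming pattern `harness_hendrycksTest_<task>_<num_fewshot>`. As a small sketch (the helper name `mmlu_config` is ours, not part of any library), you can build these config names programmatically instead of spelling them out:

```python
# Config names follow harness_hendrycksTest_<task>_<num_fewshot>,
# matching the config list in this card's YAML header.
def mmlu_config(task: str, num_fewshot: int = 5) -> str:
    """Build the config name for one MMLU subtask of this eval run."""
    return f"harness_hendrycksTest_{task}_{num_fewshot}"

print(mmlu_config("abstract_algebra"))
# The resulting name can then be passed to load_dataset, e.g.:
#   from datasets import load_dataset
#   data = load_dataset(
#       "open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V2.0",
#       mmlu_config("abstract_algebra"),
#       split="latest",
#   )
```

Loading the data itself requires network access to the Hugging Face Hub; the splits available for each config are the timestamped run split and `latest`, as defined in the YAML header above.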
## Latest results
These are the [latest results from run 2023-09-16T11:13:20.345757](https://huggingface.co/datasets/open-llm-leaderboard/details_elliotthwang__Elliott-Chinese-LLaMa-GPTQ-V2.0/blob/main/results_2023-09-16T11-13-20.345757.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4948662611548734,
"acc_stderr": 0.03510121450482972,
"acc_norm": 0.4987437024013202,
"acc_norm_stderr": 0.03509047386923906,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.44695776879520727,
"mc2_stderr": 0.014691595442781428
},
"harness|arc:challenge|25": {
"acc": 0.4726962457337884,
"acc_stderr": 0.014589589101985996,
"acc_norm": 0.507679180887372,
"acc_norm_stderr": 0.01460966744089257
},
"harness|hellaswag|10": {
"acc": 0.5598486357299343,
"acc_stderr": 0.004953907062096598,
"acc_norm": 0.7536347341167098,
"acc_norm_stderr": 0.004300131223340694
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.28835978835978837,
"acc_stderr": 0.023330654054535892,
"acc_norm": 0.28835978835978837,
"acc_norm_stderr": 0.023330654054535892
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5645161290322581,
"acc_stderr": 0.028206225591502734,
"acc_norm": 0.5645161290322581,
"acc_norm_stderr": 0.028206225591502734
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3251231527093596,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.3251231527093596,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03427308652999934,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03427308652999934
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.032577140777096614,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.032577140777096614
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44358974358974357,
"acc_stderr": 0.025189149894764194,
"acc_norm": 0.44358974358974357,
"acc_norm_stderr": 0.025189149894764194
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6788990825688074,
"acc_stderr": 0.02001814977273375,
"acc_norm": 0.6788990825688074,
"acc_norm_stderr": 0.02001814977273375
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.03384132045674118,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.03384132045674118
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6708860759493671,
"acc_stderr": 0.030587326294702368,
"acc_norm": 0.6708860759493671,
"acc_norm_stderr": 0.030587326294702368
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.0418644516301375,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.0418644516301375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5460122699386503,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.5460122699386503,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.028286324075564393,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.028286324075564393
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.016543785026048308,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.016543785026048308
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5057803468208093,
"acc_stderr": 0.026917296179149123,
"acc_norm": 0.5057803468208093,
"acc_norm_stderr": 0.026917296179149123
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.028452639985088006,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.028452639985088006
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5627009646302251,
"acc_stderr": 0.028173917761762902,
"acc_norm": 0.5627009646302251,
"acc_norm_stderr": 0.028173917761762902
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5679012345679012,
"acc_stderr": 0.02756301097160668,
"acc_norm": 0.5679012345679012,
"acc_norm_stderr": 0.02756301097160668
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.02860208586275942,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.02860208586275942
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34419817470664926,
"acc_stderr": 0.012134433741002574,
"acc_norm": 0.34419817470664926,
"acc_norm_stderr": 0.012134433741002574
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46568627450980393,
"acc_stderr": 0.020180144843307296,
"acc_norm": 0.46568627450980393,
"acc_norm_stderr": 0.020180144843307296
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5102040816326531,
"acc_stderr": 0.03200255347893783,
"acc_norm": 0.5102040816326531,
"acc_norm_stderr": 0.03200255347893783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.032510068164586174,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.032510068164586174
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.44695776879520727,
"mc2_stderr": 0.014691595442781428
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BangumiBase/toarukagakunorailgun | 2023-09-29T08:19:15.000Z | [
"size_categories:10K<n<100K",
"license:mit",
"art",
"region:us"
] | BangumiBase | null | null | null | 0 | 0 | ---
license: mit
tags:
- art
size_categories:
- 10K<n<100K
---
# Bangumi Image Base of Toaru Kagaku No Railgun
This is the image base of the bangumi Toaru Kagaku no Railgun. We detected 165 characters and 18219 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
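As a rough illustration, such preprocessing could be as simple as unpacking a character's `dataset.zip` and discarding implausibly small files. The sketch below is a hypothetical heuristic, not part of the dataset tooling; in particular, the `min_bytes` threshold is an assumption you should tune for your own use case:

```python
import zipfile
from pathlib import Path

def extract_and_prune(zip_path: str, out_dir: str, min_bytes: int = 4096) -> list:
    """Unpack one character's dataset.zip, then drop files smaller than
    min_bytes, which are more likely to be artifacts than usable images."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(out)
    removed = []
    for f in sorted(out.rglob("*")):
        if f.is_file() and f.stat().st_size < min_bytes:
            f.unlink()               # discard the suspect sample
            removed.append(str(f))
    return removed
```

A size filter will not catch every kind of noise (mis-clustered crops, for example), so manually inspecting a sample of each character folder before training is still worthwhile.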
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:----------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|
| 0 | 82 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 145 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 24 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 1375 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 14 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 38 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 83 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 27 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 38 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 40 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 4044 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 94 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 65 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 38 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 18 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 38 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 44 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 38 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 43 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 118 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 217 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 98 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 56 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 136 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 31 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 33 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 46 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 421 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 86 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 116 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 27 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 126 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 85 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 316 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 102 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 19 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 38 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 75 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 12 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 28 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 82 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 41 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 42 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 61 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 42 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 37 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 24 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 70 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 154 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 82 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 61 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 102 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 1930 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 45 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 21 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 35 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 36 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 35 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 51 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 231 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 13 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 36 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 14 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 25 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 231 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 80 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 21 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 11 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 79 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 31 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 33 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| 71 | 18 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| 72 | 356 | [Download](72/dataset.zip) |  |  |  |  |  |  |  |  |
| 73 | 23 | [Download](73/dataset.zip) |  |  |  |  |  |  |  |  |
| 74 | 16 | [Download](74/dataset.zip) |  |  |  |  |  |  |  |  |
| 75 | 16 | [Download](75/dataset.zip) |  |  |  |  |  |  |  |  |
| 76 | 27 | [Download](76/dataset.zip) |  |  |  |  |  |  |  |  |
| 77 | 25 | [Download](77/dataset.zip) |  |  |  |  |  |  |  |  |
| 78 | 18 | [Download](78/dataset.zip) |  |  |  |  |  |  |  |  |
| 79 | 12 | [Download](79/dataset.zip) |  |  |  |  |  |  |  |  |
| 80 | 1443 | [Download](80/dataset.zip) |  |  |  |  |  |  |  |  |
| 81 | 67 | [Download](81/dataset.zip) |  |  |  |  |  |  |  |  |
| 82 | 35 | [Download](82/dataset.zip) |  |  |  |  |  |  |  |  |
| 83 | 46 | [Download](83/dataset.zip) |  |  |  |  |  |  |  |  |
| 84 | 73 | [Download](84/dataset.zip) |  |  |  |  |  |  |  |  |
| 85 | 18 | [Download](85/dataset.zip) |  |  |  |  |  |  |  |  |
| 86 | 22 | [Download](86/dataset.zip) |  |  |  |  |  |  |  |  |
| 87 | 64 | [Download](87/dataset.zip) |  |  |  |  |  |  |  |  |
| 88 | 40 | [Download](88/dataset.zip) |  |  |  |  |  |  |  |  |
| 89 | 26 | [Download](89/dataset.zip) |  |  |  |  |  |  |  |  |
| 90 | 20 | [Download](90/dataset.zip) |  |  |  |  |  |  |  |  |
| 91 | 17 | [Download](91/dataset.zip) |  |  |  |  |  |  |  |  |
| 92 | 15 | [Download](92/dataset.zip) |  |  |  |  |  |  |  |  |
| 93 | 365 | [Download](93/dataset.zip) |  |  |  |  |  |  |  |  |
| 94 | 16 | [Download](94/dataset.zip) |  |  |  |  |  |  |  |  |
| 95 | 34 | [Download](95/dataset.zip) |  |  |  |  |  |  |  |  |
| 96 | 11 | [Download](96/dataset.zip) |  |  |  |  |  |  |  |  |
| 97 | 168 | [Download](97/dataset.zip) |  |  |  |  |  |  |  |  |
| 98 | 28 | [Download](98/dataset.zip) |  |  |  |  |  |  |  |  |
| 99 | 17 | [Download](99/dataset.zip) |  |  |  |  |  |  |  |  |
| 100 | 38 | [Download](100/dataset.zip) |  |  |  |  |  |  |  |  |
| 101 | 21 | [Download](101/dataset.zip) |  |  |  |  |  |  |  |  |
| 102 | 16 | [Download](102/dataset.zip) |  |  |  |  |  |  |  |  |
| 103 | 22 | [Download](103/dataset.zip) |  |  |  |  |  |  |  |  |
| 104 | 65 | [Download](104/dataset.zip) |  |  |  |  |  |  |  |  |
| 105 | 10 | [Download](105/dataset.zip) |  |  |  |  |  |  |  |  |
| 106 | 120 | [Download](106/dataset.zip) |  |  |  |  |  |  |  |  |
| 107 | 27 | [Download](107/dataset.zip) |  |  |  |  |  |  |  |  |
| 108 | 17 | [Download](108/dataset.zip) |  |  |  |  |  |  |  |  |
| 109 | 17 | [Download](109/dataset.zip) |  |  |  |  |  |  |  |  |
| 110 | 15 | [Download](110/dataset.zip) |  |  |  |  |  |  |  |  |
| 111 | 36 | [Download](111/dataset.zip) |  |  |  |  |  |  |  |  |
| 112 | 17 | [Download](112/dataset.zip) |  |  |  |  |  |  |  |  |
| 113 | 16 | [Download](113/dataset.zip) |  |  |  |  |  |  |  |  |
| 114 | 16 | [Download](114/dataset.zip) |  |  |  |  |  |  |  |  |
| 115 | 20 | [Download](115/dataset.zip) |  |  |  |  |  |  |  |  |
| 116 | 199 | [Download](116/dataset.zip) |  |  |  |  |  |  |  |  |
| 117 | 26 | [Download](117/dataset.zip) |  |  |  |  |  |  |  |  |
| 118 | 18 | [Download](118/dataset.zip) |  |  |  |  |  |  |  |  |
| 119 | 7 | [Download](119/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 120 | 149 | [Download](120/dataset.zip) |  |  |  |  |  |  |  |  |
| 121 | 41 | [Download](121/dataset.zip) |  |  |  |  |  |  |  |  |
| 122 | 40 | [Download](122/dataset.zip) |  |  |  |  |  |  |  |  |
| 123 | 16 | [Download](123/dataset.zip) |  |  |  |  |  |  |  |  |
| 124 | 67 | [Download](124/dataset.zip) |  |  |  |  |  |  |  |  |
| 125 | 12 | [Download](125/dataset.zip) |  |  |  |  |  |  |  |  |
| 126 | 40 | [Download](126/dataset.zip) |  |  |  |  |  |  |  |  |
| 127 | 15 | [Download](127/dataset.zip) |  |  |  |  |  |  |  |  |
| 128 | 9 | [Download](128/dataset.zip) |  |  |  |  |  |  |  |  |
| 129 | 15 | [Download](129/dataset.zip) |  |  |  |  |  |  |  |  |
| 130 | 14 | [Download](130/dataset.zip) |  |  |  |  |  |  |  |  |
| 131 | 50 | [Download](131/dataset.zip) |  |  |  |  |  |  |  |  |
| 132 | 8 | [Download](132/dataset.zip) |  |  |  |  |  |  |  |  |
| 133 | 18 | [Download](133/dataset.zip) |  |  |  |  |  |  |  |  |
| 134 | 57 | [Download](134/dataset.zip) |  |  |  |  |  |  |  |  |
| 135 | 28 | [Download](135/dataset.zip) |  |  |  |  |  |  |  |  |
| 136 | 13 | [Download](136/dataset.zip) |  |  |  |  |  |  |  |  |
| 137 | 47 | [Download](137/dataset.zip) |  |  |  |  |  |  |  |  |
| 138 | 16 | [Download](138/dataset.zip) |  |  |  |  |  |  |  |  |
| 139 | 12 | [Download](139/dataset.zip) |  |  |  |  |  |  |  |  |
| 140 | 12 | [Download](140/dataset.zip) |  |  |  |  |  |  |  |  |
| 141 | 19 | [Download](141/dataset.zip) |  |  |  |  |  |  |  |  |
| 142 | 423 | [Download](142/dataset.zip) |  |  |  |  |  |  |  |  |
| 143 | 20 | [Download](143/dataset.zip) |  |  |  |  |  |  |  |  |
| 144 | 15 | [Download](144/dataset.zip) |  |  |  |  |  |  |  |  |
| 145 | 154 | [Download](145/dataset.zip) |  |  |  |  |  |  |  |  |
| 146 | 53 | [Download](146/dataset.zip) |  |  |  |  |  |  |  |  |
| 147 | 14 | [Download](147/dataset.zip) |  |  |  |  |  |  |  |  |
| 148 | 13 | [Download](148/dataset.zip) |  |  |  |  |  |  |  |  |
| 149 | 115 | [Download](149/dataset.zip) |  |  |  |  |  |  |  |  |
| 150 | 35 | [Download](150/dataset.zip) |  |  |  |  |  |  |  |  |
| 151 | 41 | [Download](151/dataset.zip) |  |  |  |  |  |  |  |  |
| 152 | 12 | [Download](152/dataset.zip) |  |  |  |  |  |  |  |  |
| 153 | 17 | [Download](153/dataset.zip) |  |  |  |  |  |  |  |  |
| 154 | 13 | [Download](154/dataset.zip) |  |  |  |  |  |  |  |  |
| 155 | 14 | [Download](155/dataset.zip) |  |  |  |  |  |  |  |  |
| 156 | 88 | [Download](156/dataset.zip) |  |  |  |  |  |  |  |  |
| 157 | 13 | [Download](157/dataset.zip) |  |  |  |  |  |  |  |  |
| 158 | 9 | [Download](158/dataset.zip) |  |  |  |  |  |  |  |  |
| 159 | 13 | [Download](159/dataset.zip) |  |  |  |  |  |  |  |  |
| 160 | 6 | [Download](160/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 161 | 7 | [Download](161/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 162 | 67 | [Download](162/dataset.zip) |  |  |  |  |  |  |  |  |
| 163 | 5 | [Download](163/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| noise | 385 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
mirshad7/NERDS360 | 2023-09-18T15:35:29.000Z | [
"license:cc-by-nc-4.0",
"arxiv:2308.12967",
"region:us"
] | mirshad7 | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
---
# NEO 360: Neural Fields for Sparse View Synthesis of Outdoor Scenes
[](https://opensource.org/licenses/MIT)<img src="demo/Pytorch_logo.png" width="10%">
This repository is the PyTorch implementation of our paper:
<a href="https://www.tri.global/" target="_blank">
<img align="right" src="demo/tri-logo.png" width="25%"/>
</a>
**NEO 360: Neural Fields for Sparse View Synthesis of Outdoor Scenes**<br>
[__***Muhammad Zubair Irshad***__](https://zubairirshad.com), [Sergey Zakharov](https://zakharos.github.io/), [Katherine Liu](https://www.thekatherineliu.com/), [Vitor Guizilini](https://www.linkedin.com/in/vitorguizilini), [Thomas Kollar](http://www.tkollar.com/site/), [Adrien Gaidon](https://adriengaidon.com/), [Zsolt Kira](https://faculty.cc.gatech.edu/~zk15/), [Rares Ambrus](https://www.tri.global/about-us/dr-rares-ambrus) <br>
International Conference on Computer Vision (ICCV), 2023<br>
[[Project Page](https://zubair-irshad.github.io/projects/neo360.html)] [[arXiv](https://arxiv.org/abs/2308.12967)] [[PDF](https://arxiv.org/pdf/2308.12967.pdf)] [[Video](https://youtu.be/avmylyL_V8c?si=eeTPhl0xJxM3fSF7)]
<p align="center">
<img src="demo/NEO_Website_1.jpg" width="100%">
</p>
<p align="center">
<img src="demo/NEO_Architecture.JPG" width="100%">
</p>
### Code Coming Soon!
## 📊 Dataset
### NERDS 360 Multi-View dataset for Outdoor Scenes
NeRDS 360 ("NeRF for Reconstruction, Decomposition and Scene Synthesis of 360° outdoor scenes") is a dataset comprising 75 unbounded scenes with full multi-view annotations, covering diverse environments for generalizable NeRF training and evaluation.
<p align="center">
<img src="demo/github_dataset.gif" width="100%">
</p>
#### Download the dataset:
* [NERDS360 Training Set](https://tri-ml-public.s3.amazonaws.com/github/neo360/datasets/PDMultiObjv6.tar.gz) - 75 Scenes (19.5 GB)
* [NERDS360 Test Set](https://tri-ml-public.s3.amazonaws.com/github/neo360/datasets/PD_v6_test.tar.gz) - 5 Scenes (2.1 GB)
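A minimal sketch for fetching and unpacking one of these splits using only the Python standard library; since the archives are large (19.5 GB and 2.1 GB), a resumable downloader such as `wget -c` may be preferable in practice, and the destination path below is just an illustrative choice:

```python
import tarfile
import urllib.request
from pathlib import Path

def download_and_extract(url: str, dest: str) -> Path:
    """Download a .tar.gz archive into dest and unpack it there."""
    dest_dir = Path(dest)
    dest_dir.mkdir(parents=True, exist_ok=True)
    archive = dest_dir / url.rsplit("/", 1)[-1]
    urllib.request.urlretrieve(url, str(archive))  # stream the archive to disk
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest_dir)                   # unpack alongside the download
    return dest_dir

# e.g. download_and_extract(
#     "https://tri-ml-public.s3.amazonaws.com/github/neo360/datasets/PD_v6_test.tar.gz",
#     "data/nerds360_test")
```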
#### Visualizing the dataset (Coming Soon):
We will release our visualization scripts to generate visualizations like the one below, i.e. plotting accumulated pointclouds, multi-view camera annotations, etc.
<p align="center">
<img src="demo/cameras.gif" width="100%">
</p>
## Citation
If you find this repository or our NERDS 360 dataset useful, please consider citing:
```
@inproceedings{irshad2023neo360,
title={NeO 360: Neural Fields for Sparse View Synthesis of Outdoor Scenes},
author={Muhammad Zubair Irshad and Sergey Zakharov and Katherine Liu and Vitor Guizilini and Thomas Kollar and Adrien Gaidon and Zsolt Kira and Rares Ambrus},
booktitle={International Conference on Computer Vision (ICCV)},
year={2023},
url={https://arxiv.org/abs/2308.12967},
}
```
|
open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu | 2023-09-16T11:43:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu](https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-16T11:41:48.010953](https://huggingface.co/datasets/open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu/blob/main/results_2023-09-16T11-41-48.010953.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5493598573437528,\n\
\ \"acc_stderr\": 0.03446142180618168,\n \"acc_norm\": 0.5534666125583615,\n\
\ \"acc_norm_stderr\": 0.03444149114363826,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.43809580607302434,\n\
\ \"mc2_stderr\": 0.014492518921297695\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5418088737201365,\n \"acc_stderr\": 0.014560220308714697,\n\
\ \"acc_norm\": 0.5750853242320819,\n \"acc_norm_stderr\": 0.014445698968520767\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6159131647082254,\n\
\ \"acc_stderr\": 0.004853845750392156,\n \"acc_norm\": 0.8249352718581956,\n\
\ \"acc_norm_stderr\": 0.0037924580005234323\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523853,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523853\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\
\ \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\": 0.632258064516129,\n\
\ \"acc_norm_stderr\": 0.02743086657997347\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.034304624161038716,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.034304624161038716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48717948717948717,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.48717948717948717,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7431192660550459,\n \"acc_stderr\": 0.018732492928342465,\n \"\
acc_norm\": 0.7431192660550459,\n \"acc_norm_stderr\": 0.018732492928342465\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997867,\n \"\
acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997867\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422882,\n \
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422882\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.02665569965392273,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.02665569965392273\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n\
\ \"acc_stderr\": 0.015190473717037507,\n \"acc_norm\": 0.7637292464878672,\n\
\ \"acc_norm_stderr\": 0.015190473717037507\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n\
\ \"acc_stderr\": 0.01633726869427012,\n \"acc_norm\": 0.39329608938547483,\n\
\ \"acc_norm_stderr\": 0.01633726869427012\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.026959344518747787,\n\
\ \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.026959344518747787\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596147,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39504563233376794,\n\
\ \"acc_stderr\": 0.012485727813251562,\n \"acc_norm\": 0.39504563233376794,\n\
\ \"acc_norm_stderr\": 0.012485727813251562\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555026,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555026\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.565359477124183,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n\
\ \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361002,\n \"mc2\": 0.43809580607302434,\n\
\ \"mc2_stderr\": 0.014492518921297695\n }\n}\n```"
repo_url: https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|arc:challenge|25_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hellaswag|10_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-41-48.010953.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-41-48.010953.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T11-41-48.010953.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T11-41-48.010953.parquet'
- config_name: results
data_files:
- split: 2023_09_16T11_41_48.010953
path:
- results_2023-09-16T11-41-48.010953.parquet
- split: latest
path:
- results_2023-09-16T11-41-48.010953.parquet
---
# Dataset Card for Evaluation run of NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu](https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu",
"harness_truthfulqa_mc_0",
	split="latest")
```
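The timestamped split names in the configs above are a mechanical rewrite of the run timestamp (hyphens in the date and colons in the time become underscores). A minimal sketch of that mapping (the `run_split_name` helper is our own illustration, not part of the `datasets` library):

```python
def run_split_name(timestamp: str) -> str:
    """Map a run timestamp such as '2023-09-16T11:41:48.010953' to the
    split name used in this dataset's configs (illustrative helper)."""
    date, time = timestamp.split("T")
    # Date keeps its shape with '-' -> '_'; time replaces ':' with '_'.
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(run_split_name("2023-09-16T11:41:48.010953"))
# → 2023_09_16T11_41_48.010953
```

This matches the split name `2023_09_16T11_41_48.010953` that appears throughout the configs above.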
## Latest results
These are the [latest results from run 2023-09-16T11:41:48.010953](https://huggingface.co/datasets/open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus-QLoRA-multigpu/blob/main/results_2023-09-16T11-41-48.010953.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5493598573437528,
"acc_stderr": 0.03446142180618168,
"acc_norm": 0.5534666125583615,
"acc_norm_stderr": 0.03444149114363826,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361002,
"mc2": 0.43809580607302434,
"mc2_stderr": 0.014492518921297695
},
"harness|arc:challenge|25": {
"acc": 0.5418088737201365,
"acc_stderr": 0.014560220308714697,
"acc_norm": 0.5750853242320819,
"acc_norm_stderr": 0.014445698968520767
},
"harness|hellaswag|10": {
"acc": 0.6159131647082254,
"acc_stderr": 0.004853845750392156,
"acc_norm": 0.8249352718581956,
"acc_norm_stderr": 0.0037924580005234323
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523853,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523853
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.02743086657997347,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.02743086657997347
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.034304624161038716,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.034304624161038716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48717948717948717,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.48717948717948717,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7431192660550459,
"acc_stderr": 0.018732492928342465,
"acc_norm": 0.7431192660550459,
"acc_norm_stderr": 0.018732492928342465
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.03344887382997867,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.03344887382997867
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422882,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422882
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.04616631111801713,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.04616631111801713
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.02665569965392273,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.02665569965392273
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7637292464878672,
"acc_stderr": 0.015190473717037507,
"acc_norm": 0.7637292464878672,
"acc_norm_stderr": 0.015190473717037507
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39329608938547483,
"acc_stderr": 0.01633726869427012,
"acc_norm": 0.39329608938547483,
"acc_norm_stderr": 0.01633726869427012
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.026959344518747787,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.026959344518747787
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596147,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39504563233376794,
"acc_stderr": 0.012485727813251562,
"acc_norm": 0.39504563233376794,
"acc_norm_stderr": 0.012485727813251562
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555026,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361002,
"mc2": 0.43809580607302434,
"mc2_stderr": 0.014492518921297695
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Brouz__Slerpeno | 2023-09-16T11:44:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Brouz/Slerpeno
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Brouz/Slerpeno](https://huggingface.co/Brouz/Slerpeno) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Brouz__Slerpeno\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-16T11:43:04.528052](https://huggingface.co/datasets/open-llm-leaderboard/details_Brouz__Slerpeno/blob/main/results_2023-09-16T11-43-04.528052.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the \"results\" config and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5692971303482038,\n\
\ \"acc_stderr\": 0.03431989312935944,\n \"acc_norm\": 0.5731333725014961,\n\
\ \"acc_norm_stderr\": 0.034297703806416606,\n \"mc1\": 0.3427172582619339,\n\
\ \"mc1_stderr\": 0.016614949385347036,\n \"mc2\": 0.4804790845275838,\n\
\ \"mc2_stderr\": 0.015355439729053656\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672877\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6444931288587931,\n\
\ \"acc_stderr\": 0.004776883632722613,\n \"acc_norm\": 0.8409679346743677,\n\
\ \"acc_norm_stderr\": 0.0036495858528211847\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340354,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340354\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992065,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992065\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.02666201057856711,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.02666201057856711\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.035025446508458714,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.035025446508458714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391245,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391245\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117474,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117474\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5256410256410257,\n \"acc_stderr\": 0.025317649726448656,\n\
\ \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448656\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.01841528635141641,\n \"\
acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.01841528635141641\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643524,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643524\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460302,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922737,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922737\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n\
\ \"acc_stderr\": 0.015046301846691807,\n \"acc_norm\": 0.7701149425287356,\n\
\ \"acc_norm_stderr\": 0.015046301846691807\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977254,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n\
\ \"acc_stderr\": 0.016536829648997102,\n \"acc_norm\": 0.42569832402234636,\n\
\ \"acc_norm_stderr\": 0.016536829648997102\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283693,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.0266756119260371,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.0266756119260371\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.0293922365846125,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.0293922365846125\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4165580182529335,\n\
\ \"acc_stderr\": 0.012591153245057387,\n \"acc_norm\": 0.4165580182529335,\n\
\ \"acc_norm_stderr\": 0.012591153245057387\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5702614379084967,\n \"acc_stderr\": 0.02002712278492854,\n \
\ \"acc_norm\": 0.5702614379084967,\n \"acc_norm_stderr\": 0.02002712278492854\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872475,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3427172582619339,\n\
\ \"mc1_stderr\": 0.016614949385347036,\n \"mc2\": 0.4804790845275838,\n\
\ \"mc2_stderr\": 0.015355439729053656\n }\n}\n```"
repo_url: https://huggingface.co/Brouz/Slerpeno
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|arc:challenge|25_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hellaswag|10_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-43-04.528052.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T11-43-04.528052.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T11-43-04.528052.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T11-43-04.528052.parquet'
- config_name: results
data_files:
- split: 2023_09_16T11_43_04.528052
path:
- results_2023-09-16T11-43-04.528052.parquet
- split: latest
path:
- results_2023-09-16T11-43-04.528052.parquet
---
# Dataset Card for Evaluation run of Brouz/Slerpeno
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Brouz/Slerpeno
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Brouz/Slerpeno](https://huggingface.co/Brouz/Slerpeno) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
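Because the split names use a fixed-width `YYYY_MM_DDTHH_MM_SS` timestamp layout, they sort lexicographically in chronological order, so the most recent run can also be identified by hand. A minimal sketch (the second timestamp below is a hypothetical later run, not one present in this dataset):

```python
# Split names follow the timestamp layout used by this dataset's runs.
# The second entry is a made-up later run for illustration.
splits = ["2023_09_16T11_43_04.528052", "2023_10_01T09_12_33.000001"]

# Fixed-width timestamps sort lexicographically in chronological order,
# so max() yields the name of the most recent run.
latest = max(splits)
print(latest)  # → 2023_10_01T09_12_33.000001
```

In practice the pre-built "latest" split alias saves you this step; the sketch only shows why the alias is easy to maintain.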
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Brouz__Slerpeno",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T11:43:04.528052](https://huggingface.co/datasets/open-llm-leaderboard/details_Brouz__Slerpeno/blob/main/results_2023-09-16T11-43-04.528052.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5692971303482038,
"acc_stderr": 0.03431989312935944,
"acc_norm": 0.5731333725014961,
"acc_norm_stderr": 0.034297703806416606,
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347036,
"mc2": 0.4804790845275838,
"mc2_stderr": 0.015355439729053656
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6168941979522184,
"acc_norm_stderr": 0.014206472661672877
},
"harness|hellaswag|10": {
"acc": 0.6444931288587931,
"acc_stderr": 0.004776883632722613,
"acc_norm": 0.8409679346743677,
"acc_norm_stderr": 0.0036495858528211847
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340354,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340354
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992065,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992065
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.02666201057856711,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.02666201057856711
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.035025446508458714,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.035025446508458714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391245,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391245
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117474,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117474
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.025317649726448656,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.025317649726448656
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.01841528635141641,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.01841528635141641
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643524,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922737,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922737
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.015046301846691807,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.015046301846691807
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977254,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.016536829648997102,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.016536829648997102
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283693,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.0266756119260371,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.0266756119260371
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4165580182529335,
"acc_stderr": 0.012591153245057387,
"acc_norm": 0.4165580182529335,
"acc_norm_stderr": 0.012591153245057387
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5702614379084967,
"acc_stderr": 0.02002712278492854,
"acc_norm": 0.5702614379084967,
"acc_norm_stderr": 0.02002712278492854
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872475,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347036,
"mc2": 0.4804790845275838,
"mc2_stderr": 0.015355439729053656
}
}
```
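The "all" block at the top aggregates the per-task scores; for the accuracy-based tasks it is (up to rounding) the mean of the per-task `acc` values. A minimal sketch of that aggregation over a small illustrative subset of the results above:

```python
# Aggregate per-task accuracies from a results dict shaped like the one above
results = {
    "harness|arc:challenge|25": {"acc": 0.5870307167235495},
    "harness|hellaswag|10": {"acc": 0.6444931288587931},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
}

mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))  # 0.5272
```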
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
beatrixlayla22/adaajah151 | 2023-09-16T12:03:11.000Z | [
"region:us"
] | beatrixlayla22 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/mochizuki_anna_theidolmstermillionlive | 2023-09-17T17:42:41.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mochizuki_anna (THE iDOLM@STER: Million Live!)
This is the dataset of mochizuki_anna (THE iDOLM@STER: Million Live!), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 543 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 543 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 543 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 543 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
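The `Download` links in the table above are relative to this dataset repository; the absolute URL for any of the archives follows the standard Hub resolve pattern, sketched below (the repo id and filename are taken from this card):

```python
def archive_url(repo_id: str, filename: str) -> str:
    # Hugging Face dataset repos serve raw files under /resolve/<revision>/
    return f"https://huggingface.co/datasets/{repo_id}/resolve/main/{filename}"

url = archive_url("CyberHarem/mochizuki_anna_theidolmstermillionlive", "dataset-raw.zip")
print(url)
```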
|
bongo2112/harmonize-SDxl-openpose-output-images | 2023-09-16T17:37:41.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
CyberHarem/makabe_mizuki_theidolmstermillionlive | 2023-09-17T17:42:43.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of makabe_mizuki (THE iDOLM@STER: Million Live!)
This is the dataset of makabe_mizuki (THE iDOLM@STER: Million Live!), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 511 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 511 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 511 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 511 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
AndreyCheche/PWGood | 2023-09-16T13:00:53.000Z | [
"region:us"
] | AndreyCheche | null | null | null | 1 | 0 | Entry not found |
vitalossreview/Cannogen-VITALOSS-Reviews | 2023-09-16T13:05:45.000Z | [
"region:us"
] | vitalossreview | null | null | null | 0 | 0 | Entry not found |
bongo2112/mbosso-SDxl-openpose-output-images | 2023-09-16T17:42:09.000Z | [
"region:us"
] | bongo2112 | null | null | null | 0 | 0 | Entry not found |
7essen/sketchData | 2023-09-16T14:23:09.000Z | [
"language:en",
"region:us"
] | 7essen | null | null | null | 0 | 0 | ---
language:
- en
--- |
CyberHarem/shimabara_elena_theidolmstermillionlive | 2023-09-17T17:42:45.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shimabara_elena (THE iDOLM@STER: Million Live!)
This is the dataset of shimabara_elena (THE iDOLM@STER: Million Live!), containing 140 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 140 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 372 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 140 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 140 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 140 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 140 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 140 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 372 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 372 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 372 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
sathayen/faiss_index | 2023-09-16T13:44:48.000Z | [
"region:us"
] | sathayen | null | null | null | 0 | 0 | Entry not found |
CyberHarem/kousaka_umi_theidolmstermillionlive | 2023-09-17T17:42:47.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kousaka_umi (THE iDOLM@STER: Million Live!)
This is the dataset of kousaka_umi (THE iDOLM@STER: Million Live!), containing 170 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 170 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 458 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 170 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 170 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 170 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 170 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 170 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 458 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 458 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 458 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_titan087__OpenLlama13B-Guanaco | 2023-09-16T14:03:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of titan087/OpenLlama13B-Guanaco
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [titan087/OpenLlama13B-Guanaco](https://huggingface.co/titan087/OpenLlama13B-Guanaco)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_titan087__OpenLlama13B-Guanaco\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T14:03:24.825840](https://huggingface.co/datasets/open-llm-leaderboard/details_titan087__OpenLlama13B-Guanaco/blob/main/results_2023-09-16T14-03-24.825840.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.0003476179896857114,\n \"f1\": 0.059600461409396,\n\
\ \"f1_stderr\": 0.0014119816542495496,\n \"acc\": 0.37350531632571854,\n\
\ \"acc_stderr\": 0.008659977992596098\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857114,\n\
\ \"f1\": 0.059600461409396,\n \"f1_stderr\": 0.0014119816542495496\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.029567854435178165,\n \
\ \"acc_stderr\": 0.004665893134220808\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7174427782162589,\n \"acc_stderr\": 0.012654062850971388\n\
\ }\n}\n```"
repo_url: https://huggingface.co/titan087/OpenLlama13B-Guanaco
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T14_03_24.825840
path:
- '**/details_harness|drop|3_2023-09-16T14-03-24.825840.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T14-03-24.825840.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T14_03_24.825840
path:
- '**/details_harness|gsm8k|5_2023-09-16T14-03-24.825840.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T14-03-24.825840.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T14_03_24.825840
path:
- '**/details_harness|winogrande|5_2023-09-16T14-03-24.825840.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T14-03-24.825840.parquet'
- config_name: results
data_files:
- split: 2023_09_16T14_03_24.825840
path:
- results_2023-09-16T14-03-24.825840.parquet
- split: latest
path:
- results_2023-09-16T14-03-24.825840.parquet
---
# Dataset Card for Evaluation run of titan087/OpenLlama13B-Guanaco
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/titan087/OpenLlama13B-Guanaco
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [titan087/OpenLlama13B-Guanaco](https://huggingface.co/titan087/OpenLlama13B-Guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_titan087__OpenLlama13B-Guanaco",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T14:03:24.825840](https://huggingface.co/datasets/open-llm-leaderboard/details_titan087__OpenLlama13B-Guanaco/blob/main/results_2023-09-16T14-03-24.825840.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857114,
"f1": 0.059600461409396,
"f1_stderr": 0.0014119816542495496,
"acc": 0.37350531632571854,
"acc_stderr": 0.008659977992596098
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857114,
"f1": 0.059600461409396,
"f1_stderr": 0.0014119816542495496
},
"harness|gsm8k|5": {
"acc": 0.029567854435178165,
"acc_stderr": 0.004665893134220808
},
"harness|winogrande|5": {
"acc": 0.7174427782162589,
"acc_stderr": 0.012654062850971388
}
}
```
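Each key in the results dict above encodes the harness name, the task, and the few-shot count, separated by `|`. A small illustrative helper (not part of the leaderboard tooling) for splitting them apart:

```python
def parse_task_key(key: str) -> tuple[str, str, int]:
    # e.g. "harness|gsm8k|5" -> ("harness", "gsm8k", 5)
    suite, task, shots = key.split("|")
    return suite, task, int(shots)

print(parse_task_key("harness|winogrande|5"))  # ('harness', 'winogrande', 5)
```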
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Rams901/sql-create-context-modified | 2023-09-16T14:11:35.000Z | [
"region:us"
] | Rams901 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: context
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1415326
num_examples: 3000
download_size: 632495
dataset_size: 1415326
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sql-create-context-modified"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/yabuki_kana_theidolmstermillionlive | 2023-09-17T17:42:49.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yabuki_kana (THE iDOLM@STER: Million Live!)
This is the dataset of yabuki_kana (THE iDOLM@STER: Million Live!), containing 79 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 79 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 210 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 79 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 79 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 79 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 79 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 79 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 210 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 210 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 210 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
kye/all-lucidrain-code-python-tokenized | 2023-09-16T14:26:09.000Z | [
"license:mit",
"region:us"
] | kye | null | null | null | 0 | 0 | ---
license: mit
---
|
kye/all-lucidrain-code-python-tokenized-8192 | 2023-09-21T01:40:03.000Z | [
"region:us"
] | kye | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: repo_name
sequence: string
- name: file_path
sequence: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 2299336
num_examples: 21
download_size: 349131
dataset_size: 2299336
---
# Dataset Card for "all-lucidrain-code-python-tokenized-8192"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChenJianHao/CQU_DATA | 2023-09-16T14:41:23.000Z | [
"region:us"
] | ChenJianHao | null | null | null | 0 | 0 | Entry not found |
HLaci/RaftSub | 2023-09-18T13:03:43.000Z | [
"benchmark:raft",
"region:us"
] | HLaci | @InProceedings{huggingface:dataset,
title = {A great new dataset},
author={huggingface, Inc.
},
year={2020}
} | null | 0 | 0 | ---
benchmark: raft
type: prediction
submission_name: SetFitBase
---
# RAFT submissions for RaftSub
## Submitting to the leaderboard
To make a submission to the [leaderboard](https://huggingface.co/spaces/ought/raft-leaderboard), there are three main steps:
1. Generate predictions on the unlabeled test set of each task
2. Validate the predictions are compatible with the evaluation framework
3. Push the predictions to the Hub!
See the instructions below for more details.
### Rules
1. To prevent overfitting to the public leaderboard, we only evaluate **one submission per week**. You can push predictions to the Hub as many times as you wish, but we will only evaluate the most recent commit in a given week.
2. Transfer or meta-learning using other datasets, including further pre-training on other corpora, is allowed.
3. Use of unlabeled test data is allowed, as it is always available in the applied setting. For example, further pre-training using the unlabeled data for a task would be permitted.
4. Systems may be augmented with information retrieved from the internet, e.g. via automated web searches.
### Submission file format
For each task in RAFT, you should create a CSV file called `predictions.csv` with your model's predictions on the unlabeled test set. Each file should have exactly 2 columns:
* ID (int)
* Label (string)
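For illustration, a minimal well-formed `predictions.csv` might look like the following (the label values here are hypothetical — use the label names of the task at hand):

```
ID,Label
0,not hate speech
1,hate speech
```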
See the dummy predictions in the `data` folder for examples with the expected format. Here is a simple example that creates a majority-class baseline:
```python
from pathlib import Path
import pandas as pd
from collections import Counter
from datasets import load_dataset, get_dataset_config_names
tasks = get_dataset_config_names("ought/raft")
for task in tasks:
    # Load dataset
    raft_subset = load_dataset("ought/raft", task)
    # Compute majority class over training set
    counter = Counter(raft_subset["train"]["Label"])
    majority_class = counter.most_common(1)[0][0]
    # Load predictions file
    preds = pd.read_csv(f"data/{task}/predictions.csv")
    # Overwrite every prediction with the name of the majority class
    preds["Label"] = raft_subset["train"].features["Label"].int2str(majority_class)
    # Save predictions
    preds.to_csv(f"data/{task}/predictions.csv", index=False)
```
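Before running the official validator, you can sanity-check the two-column contract locally. Here is a minimal stdlib sketch — the `check_predictions` helper is illustrative and not part of the repository; `cli.py validate` remains the authoritative check:

```python
import csv
import io

def check_predictions(csv_text: str) -> None:
    """Raise AssertionError if a predictions table violates the expected format."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    assert rows[0] == ["ID", "Label"], "header must be exactly ID,Label"
    for rid, label in rows[1:]:
        int(rid)                                    # ID must parse as an integer
        assert label, "Label must be a non-empty string"

# A well-formed two-row predictions file passes silently
check_predictions("ID,Label\n0,hate speech\n1,not hate speech\n")
print("format OK")
```

In practice you would read each `data/<task>/predictions.csv` from disk and pass its text through the same check.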
As you can see in the example, each `predictions.csv` file should be stored in the task's subfolder in `data` and at the end you should have something like the following:
```
data
├── ade_corpus_v2
│ ├── predictions.csv
│ └── task.json
├── banking_77
│ ├── predictions.csv
│ └── task.json
├── neurips_impact_statement_risks
│ ├── predictions.csv
│ └── task.json
├── one_stop_english
│ ├── predictions.csv
│ └── task.json
├── overruling
│ ├── predictions.csv
│ └── task.json
├── semiconductor_org_types
│ ├── predictions.csv
│ └── task.json
├── systematic_review_inclusion
│ ├── predictions.csv
│ └── task.json
├── tai_safety_research
│ ├── predictions.csv
│ └── task.json
├── terms_of_service
│ ├── predictions.csv
│ └── task.json
├── tweet_eval_hate
│ ├── predictions.csv
│ └── task.json
└── twitter_complaints
├── predictions.csv
└── task.json
```
### Validate your submission
To ensure that your submission files are correctly formatted, run the following command from the root of the repository:
```
python cli.py validate
```
If everything is correct, you should see the following message:
```
All submission files validated! ✨ 🚀 ✨
Now you can make a submission 🤗
```
### Push your submission to the Hugging Face Hub!
The final step is to commit your files and push them to the Hub:
```
python cli.py submit
```
If there are no errors, you should see the following message:
```
Submission successful! 🎉 🥳 🎉
Your submission will be evaluated on Sunday 05 September 2021 ⏳
```
where the evaluation is run every Sunday and your results will be visible on the leaderboard. | |
keaneu/llama-2-7b-chat-hf | 2023-09-16T15:25:16.000Z | [
"region:us"
] | keaneu | null | null | null | 0 | 0 | Entry not found |
CyberHarem/nagayoshi_subaru_theidolmstermillionlive | 2023-09-17T17:42:51.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nagayoshi_subaru (THE iDOLM@STER: Million Live!)
This is the dataset of nagayoshi_subaru (THE iDOLM@STER: Million Live!), containing 137 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 137 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 359 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 137 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 137 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 137 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 137 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 137 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 359 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 359 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 359 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
bindai/blond-training-data | 2023-09-16T15:30:12.000Z | [
"region:us"
] | bindai | null | null | null | 0 | 0 | Entry not found |
CyberHarem/kasuga_mirai_theidolmstermillionlive | 2023-09-17T17:42:54.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kasuga_mirai (THE iDOLM@STER: Million Live!)
This is the dataset of kasuga_mirai (THE iDOLM@STER: Million Live!), containing 180 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 180 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 482 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 180 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 180 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 180 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 180 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 180 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 482 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 482 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 482 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/julia_theidolmstermillionlive | 2023-09-17T17:42:56.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of julia (THE iDOLM@STER: Million Live!)
This is the dataset of julia (THE iDOLM@STER: Million Live!), containing 80 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 80 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 213 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 80 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 80 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 80 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 80 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 80 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 213 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 213 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 213 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
fmattera/pq_test2 | 2023-09-16T16:11:51.000Z | [
"region:us"
] | fmattera | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: file_name
dtype: string
- name: conditioning
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 158
num_examples: 1
download_size: 0
dataset_size: 158
---
# Dataset Card for "pq_test2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/takayama_sayoko_theidolmstermillionlive | 2023-09-17T17:42:58.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of takayama_sayoko (THE iDOLM@STER: Million Live!)
This is the dataset of takayama_sayoko (THE iDOLM@STER: Million Live!), containing 161 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 161 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 444 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 161 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 161 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 161 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 161 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 161 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 444 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 444 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 444 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
claoire/your-dataset-name | 2023-09-16T17:32:40.000Z | [
"region:us"
] | claoire | null | null | null | 0 | 0 | Entry not found |
CyberHarem/nakatani_iku_theidolmstermillionlive | 2023-09-17T17:43:00.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nakatani_iku (THE iDOLM@STER: Million Live!)
This is the dataset of nakatani_iku (THE iDOLM@STER: Million Live!), containing 149 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 149 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 407 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 149 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 149 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 149 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 149 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 149 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 407 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 407 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 407 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
W3Genesis/srilankan_email_dataset | 2023-09-16T17:45:28.000Z | [
"language:en",
"license:agpl-3.0",
"region:us"
] | W3Genesis | null | null | null | 1 | 0 | ---
license: agpl-3.0
language:
- en
pretty_name: g
--- |
CyberHarem/yuna_kumakumakumabear | 2023-09-17T17:43:02.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yuna
This is the dataset of Yuna, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 615 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 615 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 615 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 615 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora | 2023-09-16T18:19:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lvkaokao/llama2-7b-hf-chat-lora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lvkaokao/llama2-7b-hf-chat-lora](https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T18:19:09.096561](https://huggingface.co/datasets/open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora/blob/main/results_2023-09-16T18-19-09.096561.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002307046979865772,\n\
\ \"em_stderr\": 0.0004913221265094556,\n \"f1\": 0.06527894295302021,\n\
\ \"f1_stderr\": 0.0014475102232856358,\n \"acc\": 0.433070962730968,\n\
\ \"acc_stderr\": 0.010283233892517613\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094556,\n\
\ \"f1\": 0.06527894295302021,\n \"f1_stderr\": 0.0014475102232856358\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10765731614859743,\n \
\ \"acc_stderr\": 0.008537484003023352\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011874\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T18_19_09.096561
path:
- '**/details_harness|drop|3_2023-09-16T18-19-09.096561.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T18-19-09.096561.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T18_19_09.096561
path:
- '**/details_harness|gsm8k|5_2023-09-16T18-19-09.096561.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T18-19-09.096561.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T18_19_09.096561
path:
- '**/details_harness|winogrande|5_2023-09-16T18-19-09.096561.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T18-19-09.096561.parquet'
- config_name: results
data_files:
- split: 2023_09_16T18_19_09.096561
path:
- results_2023-09-16T18-19-09.096561.parquet
- split: latest
path:
- results_2023-09-16T18-19-09.096561.parquet
---
# Dataset Card for Evaluation run of lvkaokao/llama2-7b-hf-chat-lora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lvkaokao/llama2-7b-hf-chat-lora](https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T18:19:09.096561](https://huggingface.co/datasets/open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora/blob/main/results_2023-09-16T18-19-09.096561.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094556,
"f1": 0.06527894295302021,
"f1_stderr": 0.0014475102232856358,
"acc": 0.433070962730968,
"acc_stderr": 0.010283233892517613
},
"harness|drop|3": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094556,
"f1": 0.06527894295302021,
"f1_stderr": 0.0014475102232856358
},
"harness|gsm8k|5": {
"acc": 0.10765731614859743,
"acc_stderr": 0.008537484003023352
},
"harness|winogrande|5": {
"acc": 0.7584846093133386,
"acc_stderr": 0.012028983782011874
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Deepcord-AI/Chatdata | 2023-09-17T05:47:36.000Z | [
"size_categories:10M<n<100M",
"language:en",
"license:mit",
"deeeepio",
"deeeep.io",
"deepcord",
"region:us"
] | Deepcord-AI | null | null | null | 0 | 0 | ---
license: mit
language:
- en
tags:
- deeeepio
- deeeep.io
- deepcord
size_categories:
- 10M<n<100M
---
Deepcord chats exported into CSV files
Over 13M messages |
open-llm-leaderboard/details_MBZUAI__LaMini-GPT-124M | 2023-09-16T18:36:45.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of MBZUAI/LaMini-GPT-124M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MBZUAI/LaMini-GPT-124M](https://huggingface.co/MBZUAI/LaMini-GPT-124M) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MBZUAI__LaMini-GPT-124M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T18:36:34.459500](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__LaMini-GPT-124M/blob/main/results_2023-09-16T18-36-34.459500.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.012269295302013422,\n\
\ \"em_stderr\": 0.0011273758781873528,\n \"f1\": 0.07700503355704716,\n\
\ \"f1_stderr\": 0.001885786848498622,\n \"acc\": 0.2569060773480663,\n\
\ \"acc_stderr\": 0.007023561458220208\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.012269295302013422,\n \"em_stderr\": 0.0011273758781873528,\n\
\ \"f1\": 0.07700503355704716,\n \"f1_stderr\": 0.001885786848498622\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5138121546961326,\n\
\ \"acc_stderr\": 0.014047122916440415\n }\n}\n```"
repo_url: https://huggingface.co/MBZUAI/LaMini-GPT-124M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T18_36_34.459500
path:
- '**/details_harness|drop|3_2023-09-16T18-36-34.459500.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T18-36-34.459500.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T18_36_34.459500
path:
- '**/details_harness|gsm8k|5_2023-09-16T18-36-34.459500.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T18-36-34.459500.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T18_36_34.459500
path:
- '**/details_harness|winogrande|5_2023-09-16T18-36-34.459500.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T18-36-34.459500.parquet'
- config_name: results
data_files:
- split: 2023_09_16T18_36_34.459500
path:
- results_2023-09-16T18-36-34.459500.parquet
- split: latest
path:
- results_2023-09-16T18-36-34.459500.parquet
---
# Dataset Card for Evaluation run of MBZUAI/LaMini-GPT-124M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MBZUAI/LaMini-GPT-124M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [MBZUAI/LaMini-GPT-124M](https://huggingface.co/MBZUAI/LaMini-GPT-124M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MBZUAI__LaMini-GPT-124M",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T18:36:34.459500](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__LaMini-GPT-124M/blob/main/results_2023-09-16T18-36-34.459500.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.012269295302013422,
"em_stderr": 0.0011273758781873528,
"f1": 0.07700503355704716,
"f1_stderr": 0.001885786848498622,
"acc": 0.2569060773480663,
"acc_stderr": 0.007023561458220208
},
"harness|drop|3": {
"em": 0.012269295302013422,
"em_stderr": 0.0011273758781873528,
"f1": 0.07700503355704716,
"f1_stderr": 0.001885786848498622
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5138121546961326,
"acc_stderr": 0.014047122916440415
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
PlixTech/MicroGPT | 2023-09-16T18:44:40.000Z | [
"region:us"
] | PlixTech | null | null | null | 0 | 0 | Entry not found |
CyberHarem/fina_kumakumakumabear | 2023-09-17T17:43:04.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Fina
This is the dataset of Fina, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 609 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 609 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 609 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 609 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/oogami_tamaki_theidolmstermillionlive | 2023-09-17T17:43:06.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of oogami_tamaki (THE iDOLM@STER: Million Live!)
This is the dataset of oogami_tamaki (THE iDOLM@STER: Million Live!), containing 102 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 102 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 282 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 102 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 102 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 102 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 102 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 102 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 282 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 282 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 282 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
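Each archive listed in the table above sits at the root of its dataset repository, so a direct download URL can be assembled programmatically. A minimal sketch, assuming the standard Hugging Face `resolve` URL layout for dataset repositories (the helper name `archive_url` is ours, not part of any library):

```python
# Build a direct download URL for one of the packaged archives.
# Assumes the standard Hugging Face "resolve" URL layout for dataset repositories.
def archive_url(repo_id: str, filename: str, revision: str = "main") -> str:
    return f"https://huggingface.co/datasets/{repo_id}/resolve/{revision}/{filename}"

# For example, the raw archive of this dataset:
print(archive_url("CyberHarem/oogami_tamaki_theidolmstermillionlive", "dataset-raw.zip"))
```

The resulting URL can then be fetched with any HTTP client and unpacked locally.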
|
kimgaramisinnocentondiscord/Sakura_miyawaki | 2023-09-16T18:49:37.000Z | [
"license:openrail",
"region:us"
] | kimgaramisinnocentondiscord | null | null | null | 0 | 0 | ---
license: openrail
---
|
CyberHarem/kinoshita_hinata_theidolmstermillionlive | 2023-09-17T17:43:08.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kinoshita_hinata (THE iDOLM@STER: Million Live!)
This is the dataset of kinoshita_hinata (THE iDOLM@STER: Million Live!), containing 56 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 56 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 146 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 56 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 56 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 56 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 56 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 56 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 146 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 146 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 146 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/noire_foschurose_kumakumakumabear | 2023-09-17T17:43:10.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Noire Foschurose
This is the dataset of Noire Foschurose, containing 283 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 283 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 612 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 283 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 283 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 283 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 283 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 283 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 612 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 612 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 612 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/tokoro_megumi_theidolmstermillionlive | 2023-09-17T17:43:12.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tokoro_megumi (THE iDOLM@STER: Million Live!)
This is the dataset of tokoro_megumi (THE iDOLM@STER: Million Live!), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 527 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 527 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 527 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 527 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/shuri_kumakumakumabear | 2023-09-17T17:43:15.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Shuri
This is the dataset of Shuri, containing 134 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 134 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 277 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 134 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 134 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 134 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 134 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 134 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 277 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 277 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 277 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_PocketDoc__Dans-PersonalityEngine-13b | 2023-09-16T19:32:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of PocketDoc/Dans-PersonalityEngine-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PocketDoc/Dans-PersonalityEngine-13b](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-PersonalityEngine-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T19:32:36.390690](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-PersonalityEngine-13b/blob/main/results_2023-09-16T19-32-36.390690.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788269345,\n \"f1\": 0.05738255033557058,\n\
\ \"f1_stderr\": 0.001309097903957112,\n \"acc\": 0.4341558294682836,\n\
\ \"acc_stderr\": 0.009872366201227655\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788269345,\n\
\ \"f1\": 0.05738255033557058,\n \"f1_stderr\": 0.001309097903957112\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0932524639878696,\n \
\ \"acc_stderr\": 0.008009688838328578\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126732\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PocketDoc/Dans-PersonalityEngine-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T19_32_36.390690
path:
- '**/details_harness|drop|3_2023-09-16T19-32-36.390690.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T19-32-36.390690.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T19_32_36.390690
path:
- '**/details_harness|gsm8k|5_2023-09-16T19-32-36.390690.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T19-32-36.390690.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T19_32_36.390690
path:
- '**/details_harness|winogrande|5_2023-09-16T19-32-36.390690.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T19-32-36.390690.parquet'
- config_name: results
data_files:
- split: 2023_09_16T19_32_36.390690
path:
- results_2023-09-16T19-32-36.390690.parquet
- split: latest
path:
- results_2023-09-16T19-32-36.390690.parquet
---
# Dataset Card for Evaluation run of PocketDoc/Dans-PersonalityEngine-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PocketDoc/Dans-PersonalityEngine-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PocketDoc/Dans-PersonalityEngine-13b](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-PersonalityEngine-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T19:32:36.390690](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-PersonalityEngine-13b/blob/main/results_2023-09-16T19-32-36.390690.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788269345,
"f1": 0.05738255033557058,
"f1_stderr": 0.001309097903957112,
"acc": 0.4341558294682836,
"acc_stderr": 0.009872366201227655
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788269345,
"f1": 0.05738255033557058,
"f1_stderr": 0.001309097903957112
},
"harness|gsm8k|5": {
"acc": 0.0932524639878696,
"acc_stderr": 0.008009688838328578
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126732
}
}
```
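The per-task metrics in the JSON above are keyed by the harness task name, so a single score can be read out directly. A small sketch (the dictionary below simply mirrors the relevant entries shown above):

```python
# Mirror of the per-task entries from the results JSON above.
results = {
    "harness|winogrande|5": {"acc": 0.7750591949486977, "acc_stderr": 0.011735043564126732},
    "harness|gsm8k|5": {"acc": 0.0932524639878696, "acc_stderr": 0.008009688838328578},
}

# Read out a single metric by its task key.
winogrande_acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande 5-shot acc: {winogrande_acc:.4f}")
```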
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/maihama_ayumu_theidolmstermillionlive | 2023-09-17T17:43:17.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of maihama_ayumu (THE iDOLM@STER: Million Live!)
This is the dataset of maihama_ayumu (THE iDOLM@STER: Million Live!), containing 69 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 69 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 189 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 69 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 69 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 69 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 69 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 69 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 189 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 189 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 189 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/misaana_farrengram_kumakumakumabear | 2023-09-17T17:43:19.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Misaana Farrengram
This is the dataset of Misaana Farrengram, containing 135 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 135 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 282 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 135 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 135 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 135 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 135 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 135 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 282 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 282 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 282 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/honda_roko_theidolmstermillionlive | 2023-09-17T17:43:21.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of honda_roko (THE iDOLM@STER: Million Live!)
This is the dataset of honda_roko (THE iDOLM@STER: Million Live!), containing 29 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 29 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 71 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 29 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 29 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 29 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 29 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 29 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 71 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 71 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 71 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/shiahuoshiyuroze_kumakumakumabear | 2023-09-17T17:43:23.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Shia Foschurose
This is the dataset of Shia Foschurose, containing 201 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 201 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 480 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 201 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 201 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 201 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 201 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 201 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 480 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 480 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 480 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
realfolkcode/open-music-dataset-demo | 2023-09-16T20:20:33.000Z | [
"region:us"
] | realfolkcode | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: caption
dtype: string
splits:
- name: train
num_bytes: 387155570.0
num_examples: 8
download_size: 386530208
dataset_size: 387155570.0
---
# Dataset Card for "open-music-dataset-demo"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
typeof/TIGER-Lab-MathInstruct_PoT | 2023-09-16T20:33:18.000Z | [
"region:us"
] | typeof | null | null | null | 1 | 0 | See https://huggingface.co/datasets/TIGER-Lab/MathInstruct.
This copy is only here for convenience. |
CyberHarem/atora_kumakumakumabear | 2023-09-17T17:43:25.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Atora
This is the dataset of Atora, containing 100 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 100 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 216 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 100 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 100 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 100 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 100 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 100 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 216 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 216 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 216 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/kitakami_reika_theidolmstermillionlive | 2023-09-17T17:43:27.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kitakami_reika (THE iDOLM@STER: Million Live!)
This is the dataset of kitakami_reika (THE iDOLM@STER: Million Live!), containing 189 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 189 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 500 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 189 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 189 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 189 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 189 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 189 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 500 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 500 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 500 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_circulus__Llama-2-7b-orca-v1 | 2023-09-16T21:26:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of circulus/Llama-2-7b-orca-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [circulus/Llama-2-7b-orca-v1](https://huggingface.co/circulus/Llama-2-7b-orca-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_circulus__Llama-2-7b-orca-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T21:26:35.463636](https://huggingface.co/datasets/open-llm-leaderboard/details_circulus__Llama-2-7b-orca-v1/blob/main/results_2023-09-16T21-26-35.463636.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08557046979865772,\n\
\ \"em_stderr\": 0.0028646840549845006,\n \"f1\": 0.15811556208053656,\n\
\ \"f1_stderr\": 0.003126158993030364,\n \"acc\": 0.4151299715828343,\n\
\ \"acc_stderr\": 0.009762520250486784\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08557046979865772,\n \"em_stderr\": 0.0028646840549845006,\n\
\ \"f1\": 0.15811556208053656,\n \"f1_stderr\": 0.003126158993030364\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07808946171341925,\n \
\ \"acc_stderr\": 0.007390654481108218\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.01213438601986535\n\
\ }\n}\n```"
repo_url: https://huggingface.co/circulus/Llama-2-7b-orca-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T21_26_35.463636
path:
- '**/details_harness|drop|3_2023-09-16T21-26-35.463636.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T21-26-35.463636.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T21_26_35.463636
path:
- '**/details_harness|gsm8k|5_2023-09-16T21-26-35.463636.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T21-26-35.463636.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T21_26_35.463636
path:
- '**/details_harness|winogrande|5_2023-09-16T21-26-35.463636.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T21-26-35.463636.parquet'
- config_name: results
data_files:
- split: 2023_09_16T21_26_35.463636
path:
- results_2023-09-16T21-26-35.463636.parquet
- split: latest
path:
- results_2023-09-16T21-26-35.463636.parquet
---
# Dataset Card for Evaluation run of circulus/Llama-2-7b-orca-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/circulus/Llama-2-7b-orca-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [circulus/Llama-2-7b-orca-v1](https://huggingface.co/circulus/Llama-2-7b-orca-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_circulus__Llama-2-7b-orca-v1",
"harness_winogrande_5",
split="train")
```
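The details-dataset id above follows a naming convention visible in this card: it is derived from the evaluated model's repo id, with the `/` replaced by `__` and a `details_` prefix added. A minimal sketch of that mapping (the helper name is illustrative, not part of the `datasets` library):

```python
# Derive the details-dataset id from a model repo id, following the
# pattern observed in this card ("/" becomes "__", "details_" prefix);
# this is an assumption based on the card, not an official API.
def details_dataset_name(model_repo: str) -> str:
    return "open-llm-leaderboard/details_" + model_repo.replace("/", "__")

print(details_dataset_name("circulus/Llama-2-7b-orca-v1"))
# → open-llm-leaderboard/details_circulus__Llama-2-7b-orca-v1
```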
## Latest results
These are the [latest results from run 2023-09-16T21:26:35.463636](https://huggingface.co/datasets/open-llm-leaderboard/details_circulus__Llama-2-7b-orca-v1/blob/main/results_2023-09-16T21-26-35.463636.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08557046979865772,
"em_stderr": 0.0028646840549845006,
"f1": 0.15811556208053656,
"f1_stderr": 0.003126158993030364,
"acc": 0.4151299715828343,
"acc_stderr": 0.009762520250486784
},
"harness|drop|3": {
"em": 0.08557046979865772,
"em_stderr": 0.0028646840549845006,
"f1": 0.15811556208053656,
"f1_stderr": 0.003126158993030364
},
"harness|gsm8k|5": {
"acc": 0.07808946171341925,
"acc_stderr": 0.007390654481108218
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.01213438601986535
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
safgasgfsa/ElonMusk | 2023-09-16T21:37:40.000Z | [
"region:us"
] | safgasgfsa | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_shaohang__Sparse0.5_OPT-1.3 | 2023-09-16T21:48:30.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of shaohang/Sparse0.5_OPT-1.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [shaohang/Sparse0.5_OPT-1.3](https://huggingface.co/shaohang/Sparse0.5_OPT-1.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shaohang__Sparse0.5_OPT-1.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T21:48:19.303713](https://huggingface.co/datasets/open-llm-leaderboard/details_shaohang__Sparse0.5_OPT-1.3/blob/main/results_2023-09-16T21-48-19.303713.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003145973154362416,\n\
\ \"em_stderr\": 0.0005734993648436398,\n \"f1\": 0.047173867449664536,\n\
\ \"f1_stderr\": 0.0012666649528854216,\n \"acc\": 0.29319675461487227,\n\
\ \"acc_stderr\": 0.007301498172995543\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436398,\n\
\ \"f1\": 0.047173867449664536,\n \"f1_stderr\": 0.0012666649528854216\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225237\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.585635359116022,\n \"acc_stderr\": 0.013844846232268563\n\
\ }\n}\n```"
repo_url: https://huggingface.co/shaohang/Sparse0.5_OPT-1.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T21_48_19.303713
path:
- '**/details_harness|drop|3_2023-09-16T21-48-19.303713.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T21-48-19.303713.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T21_48_19.303713
path:
- '**/details_harness|gsm8k|5_2023-09-16T21-48-19.303713.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T21-48-19.303713.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T21_48_19.303713
path:
- '**/details_harness|winogrande|5_2023-09-16T21-48-19.303713.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T21-48-19.303713.parquet'
- config_name: results
data_files:
- split: 2023_09_16T21_48_19.303713
path:
- results_2023-09-16T21-48-19.303713.parquet
- split: latest
path:
- results_2023-09-16T21-48-19.303713.parquet
---
# Dataset Card for Evaluation run of shaohang/Sparse0.5_OPT-1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/shaohang/Sparse0.5_OPT-1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [shaohang/Sparse0.5_OPT-1.3](https://huggingface.co/shaohang/Sparse0.5_OPT-1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shaohang__Sparse0.5_OPT-1.3",
"harness_winogrande_5",
split="train")
```
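The split names in the configs above encode the run timestamp, with the `-` and `:` characters replaced by `_` (the `.` before the fractional seconds is kept). A small sketch of that mapping, assuming the pattern observed in this card's YAML rather than any official API:

```python
# Convert a run timestamp into the split name used in this card's
# configs ("-" and ":" become "_", the fractional-second "." stays);
# mirrors the pattern in the YAML above — an assumption, not an API.
def split_name(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(split_name("2023-09-16T21:48:19.303713"))
# → 2023_09_16T21_48_19.303713
```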
## Latest results
These are the [latest results from run 2023-09-16T21:48:19.303713](https://huggingface.co/datasets/open-llm-leaderboard/details_shaohang__Sparse0.5_OPT-1.3/blob/main/results_2023-09-16T21-48-19.303713.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436398,
"f1": 0.047173867449664536,
"f1_stderr": 0.0012666649528854216,
"acc": 0.29319675461487227,
"acc_stderr": 0.007301498172995543
},
"harness|drop|3": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436398,
"f1": 0.047173867449664536,
"f1_stderr": 0.0012666649528854216
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225237
},
"harness|winogrande|5": {
"acc": 0.585635359116022,
"acc_stderr": 0.013844846232268563
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
EricPeter/test1 | 2023-09-16T22:25:42.000Z | [
"region:us"
] | EricPeter | null | null | null | 0 | 0 | Entry not found |
MyneFactory/MF-Base-2 | 2023-09-16T22:43:54.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | MyneFactory | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
WhiteAiZ/USB-Universal | 2023-09-16T22:36:22.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | WhiteAiZ | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
open-llm-leaderboard/details_Brillibits__Instruct_Llama70B_Dolly15k | 2023-09-16T22:46:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Brillibits/Instruct_Llama70B_Dolly15k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Brillibits/Instruct_Llama70B_Dolly15k](https://huggingface.co/Brillibits/Instruct_Llama70B_Dolly15k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Brillibits__Instruct_Llama70B_Dolly15k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-16T22:45:23.409590](https://huggingface.co/datasets/open-llm-leaderboard/details_Brillibits__Instruct_Llama70B_Dolly15k/blob/main/results_2023-09-16T22-45-23.409590.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.693972249377468,\n\
\ \"acc_stderr\": 0.03087031520778702,\n \"acc_norm\": 0.6980333583932294,\n\
\ \"acc_norm_stderr\": 0.030840132212392974,\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.016355567611960404,\n \"mc2\": 0.46457054865454755,\n\
\ \"mc2_stderr\": 0.014082565298753024\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6382252559726962,\n \"acc_stderr\": 0.014041957945038078,\n\
\ \"acc_norm\": 0.6834470989761092,\n \"acc_norm_stderr\": 0.01359243151906808\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6777534355706034,\n\
\ \"acc_stderr\": 0.004663817291468729,\n \"acc_norm\": 0.8721370244971122,\n\
\ \"acc_norm_stderr\": 0.0033325469891901565\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\
\ \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n\
\ \"acc_norm_stderr\": 0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6893617021276596,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.6893617021276596,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.02554284681740049,\n \"\
acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.02554284681740049\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.02188617856717252,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.02188617856717252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.0301176889295036,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.0301176889295036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880232,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880232\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.02323458108842849,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.02323458108842849\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815642,\n\
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815642\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.027025433498882385,\n\
\ \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.027025433498882385\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8899082568807339,\n \"acc_stderr\": 0.013419939018681203,\n \"\
acc_norm\": 0.8899082568807339,\n \"acc_norm_stderr\": 0.013419939018681203\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813902,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813902\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746786,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746786\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037183,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037183\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628124,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628124\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.019119892798924985,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.019119892798924985\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.859514687100894,\n\
\ \"acc_stderr\": 0.012426211353093446,\n \"acc_norm\": 0.859514687100894,\n\
\ \"acc_norm_stderr\": 0.012426211353093446\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7947976878612717,\n \"acc_stderr\": 0.021742519835276277,\n\
\ \"acc_norm\": 0.7947976878612717,\n \"acc_norm_stderr\": 0.021742519835276277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45363128491620114,\n\
\ \"acc_stderr\": 0.016650437588269073,\n \"acc_norm\": 0.45363128491620114,\n\
\ \"acc_norm_stderr\": 0.016650437588269073\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7781350482315113,\n\
\ \"acc_stderr\": 0.02359885829286305,\n \"acc_norm\": 0.7781350482315113,\n\
\ \"acc_norm_stderr\": 0.02359885829286305\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.020263764996385717,\n\
\ \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.020263764996385717\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5673758865248227,\n \"acc_stderr\": 0.029555454236778845,\n \
\ \"acc_norm\": 0.5673758865248227,\n \"acc_norm_stderr\": 0.029555454236778845\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5410691003911343,\n\
\ \"acc_stderr\": 0.012727084826799804,\n \"acc_norm\": 0.5410691003911343,\n\
\ \"acc_norm_stderr\": 0.012727084826799804\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041503,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041503\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7516339869281046,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.022076326101824657,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.022076326101824657\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594162,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594162\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.016355567611960404,\n \"mc2\": 0.46457054865454755,\n\
\ \"mc2_stderr\": 0.014082565298753024\n }\n}\n```"
repo_url: https://huggingface.co/Brillibits/Instruct_Llama70B_Dolly15k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|arc:challenge|25_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hellaswag|10_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T22-45-23.409590.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-16T22-45-23.409590.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T22-45-23.409590.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-16T22-45-23.409590.parquet'
- config_name: results
data_files:
- split: 2023_09_16T22_45_23.409590
path:
- results_2023-09-16T22-45-23.409590.parquet
- split: latest
path:
- results_2023-09-16T22-45-23.409590.parquet
---
# Dataset Card for Evaluation run of Brillibits/Instruct_Llama70B_Dolly15k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Brillibits/Instruct_Llama70B_Dolly15k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Brillibits/Instruct_Llama70B_Dolly15k](https://huggingface.co/Brillibits/Instruct_Llama70B_Dolly15k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Brillibits__Instruct_Llama70B_Dolly15k",
"harness_truthfulqa_mc_0",
	split="latest")
```
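The per-task config names passed to `load_dataset` follow a simple renaming of the harness task identifiers that appear in the results below: the characters `|`, `-`, and `:` are each replaced by `_`. A small helper (illustrative only, not part of any official tooling) can derive the config name from a task identifier:

```python
def task_to_config(task_id: str) -> str:
    """Map a harness task id to this dataset's config name.

    e.g. "harness|hendrycksTest-anatomy|5" -> "harness_hendrycksTest_anatomy_5"
    """
    for ch in "|-:":
        task_id = task_id.replace(ch, "_")
    return task_id
```

For example, `task_to_config("harness|truthfulqa:mc|0")` yields `"harness_truthfulqa_mc_0"`, the config used in the snippet above.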
## Latest results
These are the [latest results from run 2023-09-16T22:45:23.409590](https://huggingface.co/datasets/open-llm-leaderboard/details_Brillibits__Instruct_Llama70B_Dolly15k/blob/main/results_2023-09-16T22-45-23.409590.json) (note that there might be results for other tasks in the repository if successive evaluation runs didn't cover the same tasks; you can find each task's results in its own configuration, under the timestamped split and the "latest" split):
```python
{
"all": {
"acc": 0.693972249377468,
"acc_stderr": 0.03087031520778702,
"acc_norm": 0.6980333583932294,
"acc_norm_stderr": 0.030840132212392974,
"mc1": 0.3219094247246022,
"mc1_stderr": 0.016355567611960404,
"mc2": 0.46457054865454755,
"mc2_stderr": 0.014082565298753024
},
"harness|arc:challenge|25": {
"acc": 0.6382252559726962,
"acc_stderr": 0.014041957945038078,
"acc_norm": 0.6834470989761092,
"acc_norm_stderr": 0.01359243151906808
},
"harness|hellaswag|10": {
"acc": 0.6777534355706034,
"acc_stderr": 0.004663817291468729,
"acc_norm": 0.8721370244971122,
"acc_norm_stderr": 0.0033325469891901565
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093274,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093274
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6893617021276596,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.6893617021276596,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.02554284681740049,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.02554284681740049
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.02188617856717252,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.02188617856717252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0301176889295036,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0301176889295036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880232,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880232
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078912,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078912
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7,
"acc_stderr": 0.02323458108842849,
"acc_norm": 0.7,
"acc_norm_stderr": 0.02323458108842849
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815642,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815642
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7773109243697479,
"acc_stderr": 0.027025433498882385,
"acc_norm": 0.7773109243697479,
"acc_norm_stderr": 0.027025433498882385
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8899082568807339,
"acc_stderr": 0.013419939018681203,
"acc_norm": 0.8899082568807339,
"acc_norm_stderr": 0.013419939018681203
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813902,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813902
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746786,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746786
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037183,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037183
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628124,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628124
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924985,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924985
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.859514687100894,
"acc_stderr": 0.012426211353093446,
"acc_norm": 0.859514687100894,
"acc_norm_stderr": 0.012426211353093446
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7947976878612717,
"acc_stderr": 0.021742519835276277,
"acc_norm": 0.7947976878612717,
"acc_norm_stderr": 0.021742519835276277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45363128491620114,
"acc_stderr": 0.016650437588269073,
"acc_norm": 0.45363128491620114,
"acc_norm_stderr": 0.016650437588269073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7781350482315113,
"acc_stderr": 0.02359885829286305,
"acc_norm": 0.7781350482315113,
"acc_norm_stderr": 0.02359885829286305
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.020263764996385717,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.020263764996385717
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5673758865248227,
"acc_stderr": 0.029555454236778845,
"acc_norm": 0.5673758865248227,
"acc_norm_stderr": 0.029555454236778845
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5410691003911343,
"acc_stderr": 0.012727084826799804,
"acc_norm": 0.5410691003911343,
"acc_norm_stderr": 0.012727084826799804
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.026556519470041503,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.026556519470041503
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824657,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824657
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594162,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594162
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3219094247246022,
"mc1_stderr": 0.016355567611960404,
"mc2": 0.46457054865454755,
"mc2_stderr": 0.014082565298753024
}
}
```
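As a small, self-contained illustration (the numbers are copied verbatim from a few of the JSON entries above; the keys are the subject part of the `harness|hendrycksTest-<subject>|5` task names), the strongest and weakest subjects can be ranked by `acc`:

```python
# Per-subject accuracies copied from a handful of the entries above.
subject_acc = {
    "us_foreign_policy": 0.94,
    "marketing": 0.905982905982906,
    "machine_learning": 0.5,
    "moral_scenarios": 0.45363128491620114,
}

# Rank subjects from strongest to weakest by accuracy.
ranked = sorted(subject_acc, key=subject_acc.get, reverse=True)
best, worst = ranked[0], ranked[-1]
```

This is just a convenience for eyeballing a run; the full per-subject table lives in the JSON block above.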
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
youngbrett48/idk | 2023-09-16T22:59:08.000Z | [
"region:us"
] | youngbrett48 | null | null | null | 0 | 0 | Entry not found |
liyucheng/trivia_qa_wiki | 2023-09-16T23:12:13.000Z | [
"region:us"
] | liyucheng | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 3340799992
num_examples: 61888
- name: validation
num_bytes: 430166050
num_examples: 7993
- name: test
num_bytes: 406046504
num_examples: 7701
download_size: 2293374081
dataset_size: 4177012546
---
# Dataset Card for "trivia_qa_wiki"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/baba_konomi_theidolmstermillionlive | 2023-09-17T17:43:29.000Z | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | CyberHarem | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of baba_konomi (THE iDOLM@STER: Million Live!)
This is the dataset of baba_konomi (THE iDOLM@STER: Million Live!), containing 199 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 199 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 535 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 199 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 199 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 199 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 199 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 199 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 535 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 535 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 535 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_bavest__fin-llama-33b-merged | 2023-09-16T23:28:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of bavest/fin-llama-33b-merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bavest/fin-llama-33b-merged](https://huggingface.co/bavest/fin-llama-33b-merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bavest__fin-llama-33b-merged\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T23:28:46.893925](https://huggingface.co/datasets/open-llm-leaderboard/details_bavest__fin-llama-33b-merged/blob/main/results_2023-09-16T23-28-46.893925.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n\
\ \"em_stderr\": 0.0004445109990558753,\n \"f1\": 0.06358221476510076,\n\
\ \"f1_stderr\": 0.0013748196874116337,\n \"acc\": 0.48127991536483655,\n\
\ \"acc_stderr\": 0.010695229631509682\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.0004445109990558753,\n\
\ \"f1\": 0.06358221476510076,\n \"f1_stderr\": 0.0013748196874116337\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16224412433661864,\n \
\ \"acc_stderr\": 0.010155130880393522\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625842\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bavest/fin-llama-33b-merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T23_28_46.893925
path:
- '**/details_harness|drop|3_2023-09-16T23-28-46.893925.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T23-28-46.893925.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T23_28_46.893925
path:
- '**/details_harness|gsm8k|5_2023-09-16T23-28-46.893925.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T23-28-46.893925.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T23_28_46.893925
path:
- '**/details_harness|winogrande|5_2023-09-16T23-28-46.893925.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T23-28-46.893925.parquet'
- config_name: results
data_files:
- split: 2023_09_16T23_28_46.893925
path:
- results_2023-09-16T23-28-46.893925.parquet
- split: latest
path:
- results_2023-09-16T23-28-46.893925.parquet
---
# Dataset Card for Evaluation run of bavest/fin-llama-33b-merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bavest/fin-llama-33b-merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bavest/fin-llama-33b-merged](https://huggingface.co/bavest/fin-llama-33b-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bavest__fin-llama-33b-merged",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T23:28:46.893925](https://huggingface.co/datasets/open-llm-leaderboard/details_bavest__fin-llama-33b-merged/blob/main/results_2023-09-16T23-28-46.893925.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0018875838926174498,
"em_stderr": 0.0004445109990558753,
"f1": 0.06358221476510076,
"f1_stderr": 0.0013748196874116337,
"acc": 0.48127991536483655,
"acc_stderr": 0.010695229631509682
},
"harness|drop|3": {
"em": 0.0018875838926174498,
"em_stderr": 0.0004445109990558753,
"f1": 0.06358221476510076,
"f1_stderr": 0.0013748196874116337
},
"harness|gsm8k|5": {
"acc": 0.16224412433661864,
"acc_stderr": 0.010155130880393522
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625842
}
}
```
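As a quick sanity check (a sketch using the literal numbers from the JSON above), the aggregate `acc` reported under `"all"` matches the mean of the per-task `acc` values:

```python
# Values copied verbatim from the results block above.
results = {
    "all": {"acc": 0.48127991536483655, "acc_stderr": 0.010695229631509682},
    "harness|gsm8k|5": {"acc": 0.16224412433661864},
    "harness|winogrande|5": {"acc": 0.8003157063930545},
}

# Mean accuracy over the tasks that report "acc" (drop reports em/f1 only).
task_accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
mean_acc = sum(task_accs) / len(task_accs)
# mean_acc agrees with results["all"]["acc"] to floating-point precision.
```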
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
boardsec/yara_dataset_v1 | 2023-09-17T00:30:43.000Z | [
"region:us"
] | boardsec | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: Chunk
dtype: string
- name: yara_rule
dtype: string
- name: cleaned_yara_rule
dtype: string
splits:
- name: train
num_bytes: 33823
num_examples: 67
download_size: 14543
dataset_size: 33823
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "yara_dataset_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |