| datasetId | card |
|---|---|
CyberHarem/minsk_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of minsk/ミンスク/明斯克 (Azur Lane)
This is the dataset of minsk/ミンスク/明斯克 (Azur Lane), containing 33 images and their tags.
The core tags of this character are `long_hair, grey_hair, purple_eyes, hat, breasts, multicolored_hair, very_long_hair, bangs, peaked_cap, white_headwear, streaked_hair, black_hair, fang, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 33 | 52.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minsk_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 33 | 25.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minsk_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 79 | 55.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minsk_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 33 | 44.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minsk_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 79 | 84.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minsk_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/minsk_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, black_gloves, looking_at_viewer, midriff, solo, suspenders, white_shirt, black_headwear, black_shorts, crop_top, navel, short_shorts, belt, handcuffs, stomach, black_footwear, black_necktie, id_card, low_ponytail, open_mouth, knee_boots, two-tone_hair, white_background, :d, collared_shirt, full_body, holding_gun |
| 1 | 17 |  |  |  |  |  | 1girl, looking_at_viewer, solo, fur_trim, smile, cleavage, open_mouth, pleated_skirt, black_thighhighs, blue_skirt, simple_background, white_background, blush, coat, large_breasts, belt, zettai_ryouiki |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | looking_at_viewer | midriff | solo | suspenders | white_shirt | black_headwear | black_shorts | crop_top | navel | short_shorts | belt | handcuffs | stomach | black_footwear | black_necktie | id_card | low_ponytail | open_mouth | knee_boots | two-tone_hair | white_background | :d | collared_shirt | full_body | holding_gun | fur_trim | smile | cleavage | pleated_skirt | black_thighhighs | blue_skirt | simple_background | blush | coat | large_breasts | zettai_ryouiki |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:----------|:-------|:-------------|:--------------|:-----------------|:---------------|:-----------|:--------|:---------------|:-------|:------------|:----------|:-----------------|:----------------|:----------|:---------------|:-------------|:-------------|:----------------|:-------------------|:-----|:-----------------|:------------|:--------------|:-----------|:--------|:-----------|:----------------|:-------------------|:-------------|:--------------------|:--------|:-------|:----------------|:-----------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | | X | | X | | | | | | | | X | | | | | | | X | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
eagle0504/larkin-web-scrape-dataset-qa-formatted-small-version | ---
dataset_info:
features:
- name: questions
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 7792
num_examples: 20
download_size: 11498
dataset_size: 7792
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SGBTalha/EUMesmo123 | ---
license: openrail
---
|
CyberHarem/seydlitz_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of seydlitz/ザイドリッツ/塞德利茨 (Azur Lane)
This is the dataset of seydlitz/ザイドリッツ/塞德利茨 (Azur Lane), containing 42 images and their tags.
The core tags of this character are `blue_eyes, pink_hair, breasts, hat, black_headwear, bangs, hair_between_eyes, peaked_cap, short_hair, military_hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 42 | 63.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/seydlitz_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 42 | 31.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/seydlitz_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 86 | 63.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/seydlitz_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 42 | 55.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/seydlitz_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 86 | 99.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/seydlitz_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/seydlitz_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, solo, red_necktie, looking_at_viewer, upper_body, cape, iron_cross, military_uniform, simple_background, blush, white_gloves |
| 1 | 10 |  |  |  |  |  | 1girl, solo, looking_at_viewer, white_gloves, military_uniform, red_necktie, thighhighs, dress, standing, holding_sword, black_cape, double-breasted, full_body, thigh_boots |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | red_necktie | looking_at_viewer | upper_body | cape | iron_cross | military_uniform | simple_background | blush | white_gloves | thighhighs | dress | standing | holding_sword | black_cape | double-breasted | full_body | thigh_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------|:--------------------|:-------------|:-------|:-------------|:-------------------|:--------------------|:--------|:---------------|:-------------|:--------|:-----------|:----------------|:-------------|:------------------|:------------|:--------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | | | | X | | | X | X | X | X | X | X | X | X | X |
|
rmgravina/jurisprudencia__0004173 | ---
license: unknown
---
|
jjjaehee/customcoopang | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5826
num_examples: 39
download_size: 2572
dataset_size: 5826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/kiichi_hogen_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kiichi_hogen/鬼一法眼/鬼一法眼 (Fate/Grand Order)
This is the dataset of kiichi_hogen/鬼一法眼/鬼一法眼 (Fate/Grand Order), containing 35 images and their tags.
The core tags of this character are `long_hair, white_hair, breasts, very_long_hair, horns, pointy_ears, bangs, orange_eyes, yellow_eyes, large_breasts, tassel, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 35 | 50.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiichi_hogen_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 35 | 32.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiichi_hogen_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 78 | 61.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiichi_hogen_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 35 | 46.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiichi_hogen_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 78 | 80.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kiichi_hogen_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kiichi_hogen_fgo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, cleavage_cutout, looking_at_viewer, smile, solo, red_armor, white_dress, armored_dress, feathers, navel_cutout, blush, thighs, spear |
| 1 | 7 |  |  |  |  |  | 1girl, black_gloves, looking_at_viewer, smile, solo, feathers, navel_cutout, red_armor, spear, cleavage |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_gloves | cleavage_cutout | looking_at_viewer | smile | solo | red_armor | white_dress | armored_dress | feathers | navel_cutout | blush | thighs | spear | cleavage |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:------------------|:--------------------|:--------|:-------|:------------|:--------------|:----------------|:-----------|:---------------|:--------|:---------|:--------|:-----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | |
| 1 | 7 |  |  |  |  |  | X | | X | | X | X | X | X | | | X | X | | | X | X |
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/fa1be0f1 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1336
dataset_size: 186
---
# Dataset Card for "fa1be0f1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
guydegnol/bulkhours | ---
license: apache-2.0
---
Support data for bulkhours |
biscayan/common_voice_16_1_ko_pseudo_labelled | ---
dataset_info:
config_name: ko
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 14882451.0
num_examples: 401
- name: validation
num_bytes: 7425997.0
num_examples: 235
- name: test
num_bytes: 9091809.0
num_examples: 282
download_size: 30261851
dataset_size: 31400257.0
configs:
- config_name: ko
data_files:
- split: train
path: ko/train-*
- split: validation
path: ko/validation-*
- split: test
path: ko/test-*
---
|
vietgpt/github | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 196563585312
num_examples: 28793312
download_size: 64794312270
dataset_size: 196563585312
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/sayu_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sayu/早柚/早柚 (Genshin Impact)
This is the dataset of sayu/早柚/早柚 (Genshin Impact), containing 434 images and their tags.
The core tags of this character are `short_hair, animal_ears, blunt_bangs, raccoon_ears, leaf_on_head, fake_animal_ears, grey_hair, tail, raccoon_tail, fake_tail, purple_eyes, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 434 | 571.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sayu_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 434 | 491.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sayu_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1042 | 993.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sayu_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/sayu_genshin',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, animal_hood, black_gloves, child, leaf, looking_at_viewer, obi, short_sleeves, simple_background, solo, white_background, black_shorts, fingerless_gloves, full_body, ninja, toeless_footwear, black_scarf, shuriken, :o, black_footwear, parted_lips, bike_shorts, elbow_gloves, pouch, short_kimono, short_shorts, blush, holding, open_mouth, standing |
| 1 | 5 |  |  |  |  |  | 1girl, animal_hood, black_gloves, fingerless_gloves, kimono, leaf, looking_at_viewer, obi, short_sleeves, simple_background, solo, arm_guards, black_scarf, child, elbow_gloves, ninja, white_background, blush, red_eyes, upper_body |
| 2 | 5 |  |  |  |  |  | 1girl, animal_hood, black_gloves, black_scarf, fingerless_gloves, japanese_clothes, leaf, looking_at_viewer, ninja, solo, arm_guards, short_sleeves, simple_background, shuriken, blush, parted_lips, upper_body, white_background |
| 3 | 5 |  |  |  |  |  | 1girl, animal_hood, japanese_clothes, leaf, looking_at_viewer, simple_background, solo, white_background, black_scarf, child, ninja, portrait, upper_body, weapon |
| 4 | 15 |  |  |  |  |  | 1boy, 1girl, hetero, blush, loli, penis, spread_legs, animal_hood, navel, nipples, flat_chest, sex, vaginal, open_mouth, mosaic_censoring, black_scarf, solo_focus, leaf, nude, black_gloves, leg_grab, on_back, outdoors, cum_in_pussy, cum_overflow, hood_up |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | animal_hood | black_gloves | child | leaf | looking_at_viewer | obi | short_sleeves | simple_background | solo | white_background | black_shorts | fingerless_gloves | full_body | ninja | toeless_footwear | black_scarf | shuriken | :o | black_footwear | parted_lips | bike_shorts | elbow_gloves | pouch | short_kimono | short_shorts | blush | holding | open_mouth | standing | kimono | arm_guards | red_eyes | upper_body | japanese_clothes | portrait | weapon | 1boy | hetero | loli | penis | spread_legs | navel | nipples | flat_chest | sex | vaginal | mosaic_censoring | solo_focus | nude | leg_grab | on_back | outdoors | cum_in_pussy | cum_overflow | hood_up |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:---------------|:--------|:-------|:--------------------|:------|:----------------|:--------------------|:-------|:-------------------|:---------------|:--------------------|:------------|:--------|:-------------------|:--------------|:-----------|:-----|:-----------------|:--------------|:--------------|:---------------|:--------|:---------------|:---------------|:--------|:----------|:-------------|:-----------|:---------|:-------------|:-----------|:-------------|:-------------------|:-----------|:---------|:-------|:---------|:-------|:--------|:--------------|:--------|:----------|:-------------|:------|:----------|:-------------------|:-------------|:-------|:-----------|:----------|:-----------|:---------------|:---------------|:----------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | | X | | X | | | | | | X | | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | X | X | | X | X | X | X | | X | | X | | X | X | | | X | | | | | | X | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | X | X | | | X | X | X | | | | X | | X | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 4 | 15 |  |  |  |  |  | X | X | X | | X | | | | | | | | | | | | X | | | | | | | | | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
autoevaluate/autoeval-staging-eval-squad_v2-squad_v2-38b250-14916082 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: deepset/minilm-uncased-squad2
metrics: ['bertscore']
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: deepset/minilm-uncased-squad2
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
whitefox44/ReflectionGPT4 | ---
license: apache-2.0
---
|
s-nlp/TextGraphs17-shared-task-dataset | ---
task_categories:
- question-answering
language:
- en
---
We present a dataset for graph-based question answering. The dataset consists of <question, candidate answer> pairs. For each candidate, we provide a graph obtained by finding the shortest path between the named entities mentioned in the question and the candidate answer. As the knowledge graph, we adopted Wikidata. Our dataset has the following fields:
* **sample_id** - an identifier for <question, candidate answer>;
* **question** - question text;
* **questionEntity** - comma-separated list of names (textual strings) for Wikidata concepts mentioned in a given question;
* **answerEntity** - a textual name of candidate answer (candidate is a concept from Wikidata) for the given question;
* **groundTruthAnswerEntity** - a textual name of ground truth answer (answer is a concept from Wikidata) for the given question;
* **answerEntityId** - a Wikidata id of candidate answer (see "answerEntity" column). Example: "Q2599";
* **questionEntityId** - a comma-separated list of Wikidata ids for concepts mentioned in a given question (list of ids for mentions from "questionEntity" column);
* **groundTruthAnswerEntityId** - a Wikidata id of ground truth answer (see "answerEntity" column). Example: "Q148234";
* **correct** - either "True" or "False". The field indicates whether a <question, answer candidate> is correct, i.e., candidate answer is a true answer to the given question;
* **graph** - a shortest-path graph for a given <question, candidate answer> pair. The graph is obtained by taking the shortest paths from all mentioned concepts ("questionEntityId" column) to the candidate answer ("answerEntityId" column) in the Wikidata knowledge graph. The graph is stored in "node-link" JSON format from NetworkX. You can import the graph using the [node_link_graph](https://networkx.org/documentation/stable/reference/readwrite/generated/networkx.readwrite.json_graph.node_link_graph.html) function.
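The node-link payload can also be inspected with the standard library alone before handing it to NetworkX; a minimal sketch (the graph below is a made-up example, not taken from the dataset):

```python
import json

# A "graph" cell holds NetworkX node-link JSON; read nodes and edges
# with stdlib only (the entity ids here are invented for illustration).
graph_json = '''
{
  "directed": false,
  "multigraph": false,
  "graph": {},
  "nodes": [{"id": "Q2599"}, {"id": "Q148234"}],
  "links": [{"source": "Q2599", "target": "Q148234"}]
}
'''

data = json.loads(graph_json)
nodes = [n["id"] for n in data["nodes"]]
edges = [(link["source"], link["target"]) for link in data["links"]]
print(nodes)   # ['Q2599', 'Q148234']
print(edges)   # [('Q2599', 'Q148234')]
```

For actual graph algorithms, pass the parsed dict to NetworkX's `node_link_graph` as described above.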
Please see our [Github](https://github.com/uhh-lt/TextGraphs17-shared-task.git) for baselines and useful code. |
Aeronsc00ll0l/Smth | ---
license: apache-2.0
---
|
joshitoppo/toppo | ---
license: bigscience-openrail-m
---
|
Nexdata/10000_Image_caption_data_of_diverse_scenes | ---
license: cc-by-nc-nd-4.0
---
## Description
10,000 image caption data points covering diverse scenes, including natural scenes, urban street scenes, exhibitions, family environments, and others, shot with different brands of cameras across multiple time periods and shooting angles. The description language is English; captions mainly describe the main scene in the image, usually covering both foreground and background.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1283?source=Huggingface
## Data size
10,000 images
## Collection environment
including natural scenes, urban street scenes, shopping mall scenes, exhibitions, family environment, displays and other scenes
## Acquisition equipment
various brands of cameras
## Collection diversity
multiple scenes, multiple time periods, multiple shooting angles
## Data format
image format is .jpg, text format is .txt
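Given the .jpg/.txt format above, pairing images with captions is straightforward; a minimal sketch, assuming each caption sits next to its image with the same basename (the layout and helper name are illustrative, not confirmed by the card):

```python
import tempfile
from pathlib import Path

# Pair each .jpg image with its same-basename .txt caption.
# The side-by-side layout is an assumption, not confirmed by the card.
def pair_captions(root):
    root = Path(root)
    pairs = []
    for img in sorted(root.glob("*.jpg")):
        txt = img.with_suffix(".txt")
        if txt.exists():
            pairs.append((img.name, txt.read_text(encoding="utf-8").strip()))
    return pairs

# Tiny self-contained demo with a fake image/caption pair
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "0001.jpg").write_bytes(b"")
    (Path(d) / "0001.txt").write_text("A street scene at dusk.", encoding="utf-8")
    demo_pairs = pair_captions(d)
print(demo_pairs)  # [('0001.jpg', 'A street scene at dusk.')]
```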
## Description language
English, Chinese
## Text length
in principle, 30~60 words, usually 3-5 sentences
## Main description content
the main scene in the image, usually including foreground and background description
## Accuracy rate
the proportion of correctly labeled images is not less than 97%
# Licensing Information
Commercial License
|
Miracle-dz/newarxiv | ---
license: other
---
|
open-llm-leaderboard/details_DenisTheDev__Blitz-AI-MOE-v0.4 | ---
pretty_name: Evaluation run of DenisTheDev/Blitz-AI-MOE-v0.4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DenisTheDev/Blitz-AI-MOE-v0.4](https://huggingface.co/DenisTheDev/Blitz-AI-MOE-v0.4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DenisTheDev__Blitz-AI-MOE-v0.4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T03:29:31.187340](https://huggingface.co/datasets/open-llm-leaderboard/details_DenisTheDev__Blitz-AI-MOE-v0.4/blob/main/results_2024-03-22T03-29-31.187340.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6441786289728297,\n\
\ \"acc_stderr\": 0.03226389476881219,\n \"acc_norm\": 0.6463478032255064,\n\
\ \"acc_norm_stderr\": 0.032907165671677265,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.01683886288396583,\n \"mc2\": 0.5355025455189468,\n\
\ \"mc2_stderr\": 0.01539835384962678\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407163,\n\
\ \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902279\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6697868950408286,\n\
\ \"acc_stderr\": 0.004693285694663837,\n \"acc_norm\": 0.8559051981676957,\n\
\ \"acc_norm_stderr\": 0.0035046810917039014\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237103,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237103\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.543859649122807,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305526,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305526\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586804,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586804\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099857,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099857\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.013547415658662264,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.013547415658662264\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468348,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468348\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475356,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475356\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n\
\ \"acc_stderr\": 0.024723861504771696,\n \"acc_norm\": 0.7459807073954984,\n\
\ \"acc_norm_stderr\": 0.024723861504771696\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n\
\ \"acc_stderr\": 0.012732398286190442,\n \"acc_norm\": 0.46153846153846156,\n\
\ \"acc_norm_stderr\": 0.012732398286190442\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144714,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144714\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724556,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724556\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.01683886288396583,\n \"mc2\": 0.5355025455189468,\n\
\ \"mc2_stderr\": 0.01539835384962678\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.601213040181956,\n \
\ \"acc_stderr\": 0.013487360477060834\n }\n}\n```"
repo_url: https://huggingface.co/DenisTheDev/Blitz-AI-MOE-v0.4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|arc:challenge|25_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|gsm8k|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hellaswag|10_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-29-31.187340.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T03-29-31.187340.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- '**/details_harness|winogrande|5_2024-03-22T03-29-31.187340.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T03-29-31.187340.parquet'
- config_name: results
data_files:
- split: 2024_03_22T03_29_31.187340
path:
- results_2024-03-22T03-29-31.187340.parquet
- split: latest
path:
- results_2024-03-22T03-29-31.187340.parquet
---
# Dataset Card for Evaluation run of DenisTheDev/Blitz-AI-MOE-v0.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DenisTheDev/Blitz-AI-MOE-v0.4](https://huggingface.co/DenisTheDev/Blitz-AI-MOE-v0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DenisTheDev__Blitz-AI-MOE-v0.4",
"harness_winogrande_5",
	split="latest")
```
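All 63 configurations follow the naming pattern visible in the YAML header above (`harness_<task>_<n_shot>`, with `-` and `:` in the task name replaced by `_`). A small helper (an illustrative sketch, not part of any library) can build these config names programmatically:

```python
def harness_config_name(task: str, n_shot: int) -> str:
    """Build the config name used by this repo for a given harness task.

    The pattern is inferred from the YAML config list above:
    '-' and ':' in the task name are replaced by '_', and the
    shot count is appended, e.g. 'hendrycksTest-virology' with
    5 shots -> 'harness_hendrycksTest_virology_5'.
    """
    safe = task.replace("-", "_").replace(":", "_")
    return f"harness_{safe}_{n_shot}"

print(harness_config_name("hendrycksTest-virology", 5))
# harness_hendrycksTest_virology_5
print(harness_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

The returned string can then be passed as the second argument to `load_dataset`, as in the snippet above.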
## Latest results
These are the [latest results from run 2024-03-22T03:29:31.187340](https://huggingface.co/datasets/open-llm-leaderboard/details_DenisTheDev__Blitz-AI-MOE-v0.4/blob/main/results_2024-03-22T03-29-31.187340.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6441786289728297,
"acc_stderr": 0.03226389476881219,
"acc_norm": 0.6463478032255064,
"acc_norm_stderr": 0.032907165671677265,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.01683886288396583,
"mc2": 0.5355025455189468,
"mc2_stderr": 0.01539835384962678
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.014157022555407163,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902279
},
"harness|hellaswag|10": {
"acc": 0.6697868950408286,
"acc_stderr": 0.004693285694663837,
"acc_norm": 0.8559051981676957,
"acc_norm_stderr": 0.0035046810917039014
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305526,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305526
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586804,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586804
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099857,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662264,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662264
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468348,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475356,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475356
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.024723861504771696,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.024723861504771696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.012732398286190442,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.012732398286190442
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144714,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144714
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724556,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724556
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.01683886288396583,
"mc2": 0.5355025455189468,
"mc2_stderr": 0.01539835384962678
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|gsm8k|5": {
"acc": 0.601213040181956,
"acc_stderr": 0.013487360477060834
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
comet-team/cppe-5 | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': Coverall
'1': Face_Shield
'2': Gloves
'3': Goggles
'4': Mask
splits:
- name: train
num_bytes: 240463364.0
num_examples: 1000
- name: test
num_bytes: 4172164.0
num_examples: 29
download_size: 239989523
dataset_size: 244635528.0
---
# Dataset Card for "cppe-5"
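Per the YAML schema above, each example's `objects.category` field is a `class_label` with five PPE classes. A minimal offline sketch of mapping the stored integer ids back to class names (the function name is illustrative; after loading with `datasets.load_dataset("comet-team/cppe-5")` the same mapping is available via the feature's `int2str`):

```python
# Class names taken from the class_label block in the YAML above.
CPPE5_CLASSES = ["Coverall", "Face_Shield", "Gloves", "Goggles", "Mask"]

def categories_to_names(category_ids):
    """Translate the integer ids stored in objects.category
    into human-readable class names, per the schema above."""
    return [CPPE5_CLASSES[i] for i in category_ids]

print(categories_to_names([0, 4, 2]))
# ['Coverall', 'Mask', 'Gloves']
```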
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ToshaTang/YPshappo | ---
license: openrail
---
|
IndonesiaAI/1000-sample | ---
dataset_info:
features:
- name: qid
dtype: int64
- name: question
dtype: string
- name: date
dtype: string
- name: metadata
sequence: string
- name: response_j
dtype: string
- name: response_k
dtype: string
splits:
- name: train
num_bytes: 266536
num_examples: 100
download_size: 183818
dataset_size: 266536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "1000-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
juancopi81/diana_uribe_large_ada_embeddings | ---
dataset_info:
features:
- name: TITLE
dtype: string
- name: URL
dtype: string
- name: TRANSCRIPTION
dtype: string
- name: transcription_length
dtype: int64
- name: text
dtype: string
- name: ada_embedding
dtype: string
splits:
- name: train
num_bytes: 128412971
num_examples: 3215
download_size: 83354282
dataset_size: 128412971
---
# Dataset Card for "diana_uribe_large_ada_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jopd/SD_Upscaler | ---
license: mit
---
Stable Diffusion Upscaler Model |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_60 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1299177972.0
num_examples: 255141
download_size: 1320270226
dataset_size: 1299177972.0
---
# Dataset Card for "chunk_60"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuncongli/chat-sentiment-analysis | ---
license: mit
language:
- en
tags:
- sentiment
- aspect-based sentiment analysis
- Aspect Term Extraction
- Opinion Term Extraction
- Aspect Term-Opinion Term Pair Extraction
- Aspect term, Sentiment, Opinion term Triplet Extraction
- Aspect Category Detection
- Aspect Category-Sentiment Pair Extraction
- Aspect-Category-Opinion-Sentiment (ACOS) Quadruple Extraction
- Holder, Target, Opinion, Sentiment (HTOS) Quadruple Extraction
- sentiment analysis
---
# A Sentiment Analysis Dataset for Fine-tuning Large Models in a Chat Style
More details can be found at https://github.com/l294265421/chat-sentiment-analysis
## Supported Tasks
- Aspect Term Extraction (ATE)
- Opinion Term Extraction (OTE)
- Aspect Term-Opinion Term Pair Extraction (AOPE)
- Aspect term, Sentiment, Opinion term Triplet Extraction (ASOTE)
- Aspect Category Detection (ACD)
- Aspect Category-Sentiment Pair Extraction (ACSA)
- [Aspect-Category-Opinion-Sentiment (ACOS) Quadruple Extraction](https://github.com/NUSTM/ACOS)
- [Holder, Target, Opinion, Sentiment (HTOS) Quadruple Extraction](https://github.com/jerbarnes/semeval22_structured_sentiment)
|
Clip11/clip13 | ---
license: apache-2.0
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-43000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 672050
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thoddnn/OpenDataGen-factuality-en-v0.1 | ---
license: mit
task_categories:
- question-answering
language:
- en
tags:
- wikipedia
- synthetic
- synthetic data
size_categories:
- n<1K
---
This synthetic dataset was generated using the Open DataGen Python library. (https://github.com/thoddnn/open-datagen)
# Methodology:
1) Retrieve random article content from the HuggingFace Wikipedia English dataset.
2) Construct a Chain of Thought (CoT) to generate a Multiple Choice Question (MCQ).
3) Use a Large Language Model (LLM) to score the results and then filter them.
All these steps are prompted in the 'template.json' file located in the specified code folder.
Code: https://github.com/thoddnn/open-datagen/blob/main/opendatagen/examples/opendata-eval/
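Step 3 above amounts to a score-and-filter pass over the generated questions. A minimal sketch, assuming each generated MCQ already carries an LLM-assigned score (the field names and threshold here are hypothetical, not the open-datagen schema):

```python
def filter_by_score(items: list[dict], threshold: float = 0.8) -> list[dict]:
    """Keep only generated questions whose LLM-assigned quality score
    meets the threshold.

    `items` uses hypothetical keys "question" and "score"; the actual
    open-datagen output schema may differ.
    """
    return [item for item in items if item["score"] >= threshold]

generated = [
    {"question": "In what year was the article's subject founded?", "score": 0.95},
    {"question": "An ambiguous, unanswerable question?", "score": 0.40},
]
print(filter_by_score(generated))  # keeps only the high-scoring item
```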
Feel free to reach out to me on LinkedIn (https://www.linkedin.com/in/thomasdordonne/) or Twitter (https://twitter.com/thoDdnn) |
c4ba/mckako | ---
license: openrail
---
|
autoevaluate/autoeval-eval-jeffdshen__redefine_math0_8shot-jeffdshen__redefine_mat-1c694b-1853263422 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jeffdshen/redefine_math0_8shot
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-66b_eval
metrics: []
dataset_name: jeffdshen/redefine_math0_8shot
dataset_config: jeffdshen--redefine_math0_8shot
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-66b_eval
* Dataset: jeffdshen/redefine_math0_8shot
* Config: jeffdshen--redefine_math0_8shot
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@jeffdshen](https://huggingface.co/jeffdshen) for evaluating this model. |
jayalakshmiK/Arakooai | ---
license: llama2
---
|
HuggingFaceH4/cai-conversation-harmless | ---
license: apache-2.0
dataset_info:
features:
- name: init_prompt
dtype: string
- name: init_response
dtype: string
- name: critic_prompt
dtype: string
- name: critic_response
dtype: string
- name: revision_prompt
dtype: string
- name: revision_response
dtype: string
- name: prompt
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 81872474
num_examples: 21268
- name: train_prefs
num_bytes: 82070344
num_examples: 21269
- name: test_sft
num_bytes: 4489276
num_examples: 1156
- name: test_prefs
num_bytes: 4523043
num_examples: 1156
download_size: 74758771
dataset_size: 172955137
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: train_prefs
path: data/train_prefs-*
- split: test_sft
path: data/test_sft-*
- split: test_prefs
path: data/test_prefs-*
---
# Dataset Card for "cai-conversation-dev1705629166"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
edbeeching/prj_gia_dataset_atari_2B_atari_alien_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the atari_alien environment, sampled from the policy atari_2B_atari_alien_1111.
This dataset was created as part of the Generally Intelligent Agents (GIA) project: https://github.com/huggingface/gia
|
AdapterOcean/pythonbook-standardized_cluster_0_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 8300399
num_examples: 2573
download_size: 0
dataset_size: 8300399
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pythonbook-standardized_cluster_0_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
burningfire123/dreambooth-hackathon-images | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 119417305.0
num_examples: 833
download_size: 99575780
dataset_size: 119417305.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nbpablom/shallty | ---
license: other
---
|
open-llm-leaderboard/details_WizardLM__WizardMath-7B-V1.1 | ---
pretty_name: Evaluation run of WizardLM/WizardMath-7B-V1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [WizardLM/WizardMath-7B-V1.1](https://huggingface.co/WizardLM/WizardMath-7B-V1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WizardLM__WizardMath-7B-V1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-20T21:22:26.878965](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardMath-7B-V1.1/blob/main/results_2023-12-20T21-22-26.878965.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6189363761315251,\n\
\ \"acc_stderr\": 0.032618810440506206,\n \"acc_norm\": 0.6192370527292648,\n\
\ \"acc_norm_stderr\": 0.03328320019631228,\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.016435632932815025,\n \"mc2\": 0.47044548067060826,\n\
\ \"mc2_stderr\": 0.015719256312305734\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520767,\n\
\ \"acc_norm\": 0.6186006825938567,\n \"acc_norm_stderr\": 0.014194389086685247\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6611232822146983,\n\
\ \"acc_stderr\": 0.004723605376936913,\n \"acc_norm\": 0.8449512049392551,\n\
\ \"acc_norm_stderr\": 0.003612114670698977\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n\
\ \"acc_stderr\": 0.025378139970885196,\n \"acc_norm\": 0.7258064516129032,\n\
\ \"acc_norm_stderr\": 0.025378139970885196\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217905,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217905\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696043,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258165,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258165\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3452513966480447,\n\
\ \"acc_stderr\": 0.01590143260893035,\n \"acc_norm\": 0.3452513966480447,\n\
\ \"acc_norm_stderr\": 0.01590143260893035\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275749,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275749\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.026730620728004906,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.026730620728004906\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370586,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370586\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43546284224250326,\n\
\ \"acc_stderr\": 0.012663412101248335,\n \"acc_norm\": 0.43546284224250326,\n\
\ \"acc_norm_stderr\": 0.012663412101248335\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n\
\ \"mc1_stderr\": 0.016435632932815025,\n \"mc2\": 0.47044548067060826,\n\
\ \"mc2_stderr\": 0.015719256312305734\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698338\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6739954510993177,\n \
\ \"acc_stderr\": 0.012911675645682845\n }\n}\n```"
repo_url: https://huggingface.co/WizardLM/WizardMath-7B-V1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|arc:challenge|25_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|gsm8k|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hellaswag|10_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T21-22-26.878965.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-20T21-22-26.878965.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- '**/details_harness|winogrande|5_2023-12-20T21-22-26.878965.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-20T21-22-26.878965.parquet'
- config_name: results
data_files:
- split: 2023_12_20T21_22_26.878965
path:
- results_2023-12-20T21-22-26.878965.parquet
- split: latest
path:
- results_2023-12-20T21-22-26.878965.parquet
---
# Dataset Card for Evaluation run of WizardLM/WizardMath-7B-V1.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [WizardLM/WizardMath-7B-V1.1](https://huggingface.co/WizardLM/WizardMath-7B-V1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardMath-7B-V1.1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-20T21:22:26.878965](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardMath-7B-V1.1/blob/main/results_2023-12-20T21-22-26.878965.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the timestamped and "latest" splits):
```python
{
"all": {
"acc": 0.6189363761315251,
"acc_stderr": 0.032618810440506206,
"acc_norm": 0.6192370527292648,
"acc_norm_stderr": 0.03328320019631228,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.016435632932815025,
"mc2": 0.47044548067060826,
"mc2_stderr": 0.015719256312305734
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520767,
"acc_norm": 0.6186006825938567,
"acc_norm_stderr": 0.014194389086685247
},
"harness|hellaswag|10": {
"acc": 0.6611232822146983,
"acc_stderr": 0.004723605376936913,
"acc_norm": 0.8449512049392551,
"acc_norm_stderr": 0.003612114670698977
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.025378139970885196,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.025378139970885196
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217905,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217905
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477518,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477518
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596915,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596915
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258165,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258165
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3452513966480447,
"acc_stderr": 0.01590143260893035,
"acc_norm": 0.3452513966480447,
"acc_norm_stderr": 0.01590143260893035
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275749,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275749
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.026730620728004906,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.026730620728004906
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370586,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370586
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43546284224250326,
"acc_stderr": 0.012663412101248335,
"acc_norm": 0.43546284224250326,
"acc_norm_stderr": 0.012663412101248335
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.016435632932815025,
"mc2": 0.47044548067060826,
"mc2_stderr": 0.015719256312305734
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698338
},
"harness|gsm8k|5": {
"acc": 0.6739954510993177,
"acc_stderr": 0.012911675645682845
}
}
```
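The `"all"` block at the top of these results is an aggregate over the per-task metrics. As a rough, self-contained sketch of that kind of aggregation (the leaderboard's exact task selection and weighting may differ, so treat this as illustrative only), a macro-average over task entries looks like:

```python
# Sketch: macro-average a metric over every task in a results dict shaped
# like the JSON above. Assumption: the aggregate is an unweighted mean of
# per-task values; the leaderboard's actual computation may differ.

def macro_average(results: dict, metric: str = "acc") -> float:
    """Mean of `metric` over every task that reports it,
    skipping the precomputed "all" summary entry."""
    values = [
        task_metrics[metric]
        for task_name, task_metrics in results.items()
        if task_name != "all" and metric in task_metrics
    ]
    return sum(values) / len(values)

# Tiny illustrative subset of the results above:
sample = {
    "all": {"acc": 0.5},  # precomputed summary, ignored by the helper
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6, "acc_norm": 0.6},
    "harness|hendrycksTest-virology|5": {"acc": 0.4, "acc_norm": 0.5},
}
print(macro_average(sample))              # -> 0.5
print(macro_average(sample, "acc_norm"))
```

Run against the full `results` configuration, this reproduces the spirit of the `"all"` numbers, though tasks reporting only `mc1`/`mc2` (like truthfulqa) are aggregated separately.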
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Vitorbr2009/Afanaubeto | ---
license: openrail
---
|
roa7n/patched_1000_test_p_100_m2_predictions | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence_str
dtype: string
- name: label
dtype: int64
- name: features
sequence: float64
- name: m2_preds
dtype: float32
splits:
- name: train
num_bytes: 5874386840
num_examples: 659861
download_size: 5594068699
dataset_size: 5874386840
---
# Dataset Card for "patched_1000_test_p_100_m2_predictions"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sultu/Lord_x | ---
license: openrail
---
|
rwkv-x-dev/openorca-gpt4 | ---
pretty_name: OpenOrca
configs:
- config_name: default
default: true
data_files:
- split: train
path:
- "*.parquet"
---
OpenOrca, but just the GPT-4 bits. |
open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft | ---
pretty_name: Evaluation run of Yukang/Llama-2-13b-longlora-32k-ft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yukang/Llama-2-13b-longlora-32k-ft](https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T02:49:42.173825](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft/blob/main/results_2023-10-28T02-49-42.173825.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.0003314581465219189,\n \"f1\": 0.05492764261744986,\n\
\ \"f1_stderr\": 0.0012887827966655012,\n \"acc\": 0.4163294284912454,\n\
\ \"acc_stderr\": 0.009719919588691044\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219189,\n\
\ \"f1\": 0.05492764261744986,\n \"f1_stderr\": 0.0012887827966655012\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07733131159969674,\n \
\ \"acc_stderr\": 0.00735771352322235\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T02_49_42.173825
path:
- '**/details_harness|drop|3_2023-10-28T02-49-42.173825.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T02-49-42.173825.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T02_49_42.173825
path:
- '**/details_harness|gsm8k|5_2023-10-28T02-49-42.173825.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T02-49-42.173825.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T02_49_42.173825
path:
- '**/details_harness|winogrande|5_2023-10-28T02-49-42.173825.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T02-49-42.173825.parquet'
- config_name: results
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- results_2023-10-10T13-26-13.835261.parquet
- split: 2023_10_28T02_49_42.173825
path:
- results_2023-10-28T02-49-42.173825.parquet
- split: latest
path:
- results_2023-10-28T02-49-42.173825.parquet
---
# Dataset Card for Evaluation run of Yukang/Llama-2-13b-longlora-32k-ft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yukang/Llama-2-13b-longlora-32k-ft](https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T02:49:42.173825](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft/blob/main/results_2023-10-28T02-49-42.173825.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219189,
"f1": 0.05492764261744986,
"f1_stderr": 0.0012887827966655012,
"acc": 0.4163294284912454,
"acc_stderr": 0.009719919588691044
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219189,
"f1": 0.05492764261744986,
"f1_stderr": 0.0012887827966655012
},
"harness|gsm8k|5": {
"acc": 0.07733131159969674,
"acc_stderr": 0.00735771352322235
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
}
}
```
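As a quick sanity check, the aggregate `acc` in the `"all"` block above appears to be the unweighted mean of the per-task accuracies (this averaging rule is our reading of the numbers, not a documented guarantee of the harness):

```python
# Per-task accuracies copied from the results JSON above.
task_acc = {
    "harness|gsm8k|5": 0.07733131159969674,
    "harness|winogrande|5": 0.755327545382794,
}

# Unweighted mean across tasks reproduces the "all" accuracy.
aggregate_acc = sum(task_acc.values()) / len(task_acc)
print(aggregate_acc)  # matches "acc": 0.4163294284912454 in the "all" block
```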
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
amay01/llm-sgd-dst8-training-data | ---
dataset_info:
features:
- name: output
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 62883134
num_examples: 175780
download_size: 10265723
dataset_size: 62883134
---
# Dataset Card for "llm-sgd-dst8-training-data"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dhika/defect_rail | ---
license: unknown
---
|
joefox/Russian_LibriSpeech_RuLS_test_noise | ---
license: apache-2.0
---
### Dataset Summary
An augmented version of the test split of the Russian LibriSpeech (RuLS) dataset.
The original test data was taken as a basis and augmented by adding extraneous noise.
Dataset part: test
|
autoevaluate/autoeval-staging-eval-project-d60b4e7e-7574887 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xtreme
eval_info:
task: entity_extraction
model: Ninh/xlm-roberta-base-finetuned-panx-de
metrics: []
dataset_name: xtreme
dataset_config: PAN-X.de
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: Ninh/xlm-roberta-base-finetuned-panx-de
* Dataset: xtreme
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
andersonbcdefg/inpars_sample_triples | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 536694326
num_examples: 255000
download_size: 304690252
dataset_size: 536694326
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jerteh/SrpKor4Tagging | ---
license: cc-by-sa-4.0
task_categories:
- token-classification
language:
- sr
pretty_name: SrpKor4Tagging training dataset
size_categories:
- 100K<n<1M
---
The corpus was created from a mix of literary (⅓) and administrative (⅔) texts in Serbian.
It is POS-tagged with two tagsets, the Universal POS tagset and the SrpLemKor tagset (based on traditional, descriptive Serbian grammar), and lemmatized.
It consists of a single JSONL file that can be loaded via:
```python
from datasets import load_dataset
dataset = load_dataset("jerteh/SrpKor4Tagging")
```
Preview:
```python
ds = dataset["train"][1389]
for token, tag, lemma in zip(ds["token"], ds["ud"], ds["lemma"]):
    print(token, tag, lemma)
```
```
Okrugle ADJ okrugao
mongolske ADJ mongolski
fizionomije NOUN fizionomija
behu AUX biti
ustupile VERB ustupiti
mesto NOUN mesto
licima NOUN lice
evropskijeg ADJ evropski
tipa NOUN tip
, PUNCT ,
prljavim ADJ prljav
, PUNCT ,
obradatelim ADJ obradateo
i CCONJ i
iscrpenim ADJ iscrpen
. PUNCT .
```
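A common downstream use of such parallel token/tag/lemma columns is exporting CoNLL-style tab-separated lines. A minimal sketch using sample values from the preview above (the `to_conll` helper name is our own, not part of the dataset):

```python
# Sample values copied from the preview above.
sample = {
    "token": ["Okrugle", "mongolske", "fizionomije"],
    "ud": ["ADJ", "ADJ", "NOUN"],
    "lemma": ["okrugao", "mongolski", "fizionomija"],
}

def to_conll(example):
    """Join parallel token/tag/lemma lists into 'token<TAB>tag<TAB>lemma' lines."""
    return [
        "\t".join(cols)
        for cols in zip(example["token"], example["ud"], example["lemma"])
    ]

print("\n".join(to_conll(sample)))
```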
Citation:
```bibtex
@inproceedings{stankovic-etal-2020-machine,
title = "Machine Learning and Deep Neural Network-Based Lemmatization and Morphosyntactic Tagging for {S}erbian",
author = "Stankovic, Ranka and
{\v{S}}andrih, Branislava and
Krstev, Cvetana and
Utvi{\'c}, Milo{\v{s}} and
Skoric, Mihailo",
editor = "Calzolari, Nicoletta and
B{\'e}chet, Fr{\'e}d{\'e}ric and
Blache, Philippe and
Choukri, Khalid and
Cieri, Christopher and
Declerck, Thierry and
Goggi, Sara and
Isahara, Hitoshi and
Maegaard, Bente and
Mariani, Joseph and
Mazo, H{\'e}l{\`e}ne and
Moreno, Asuncion and
Odijk, Jan and
Piperidis, Stelios",
booktitle = "Proceedings of the Twelfth Language Resources and Evaluation Conference",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2020.lrec-1.487",
pages = "3954--3962",
abstract = "The training of new tagger models for Serbian is primarily motivated by the enhancement of the existing tagset with the grammatical category of a gender. The harmonization of resources that were manually annotated within different projects over a long period of time was an important task, enabled by the development of tools that support partial automation. The supporting tools take into account different taggers and tagsets. This paper focuses on TreeTagger and spaCy taggers, and the annotation schema alignment between Serbian morphological dictionaries, MULTEXT-East and Universal Part-of-Speech tagset. The trained models will be used to publish the new version of the Corpus of Contemporary Serbian as well as the Serbian literary corpus. The performance of developed taggers were compared and the impact of training set size was investigated, which resulted in around 98{\%} PoS-tagging precision per token for both new models. The sr{\_}basic annotated dataset will also be published.",
language = "English",
ISBN = "979-10-95546-34-4",
}
```
|
gsarti/flores_101 | ---
annotations_creators:
- found
language_creators:
- expert-generated
language:
- af
- am
- ar
- hy
- as
- ast
- az
- be
- bn
- bs
- bg
- my
- ca
- ceb
- zho
- hr
- cs
- da
- nl
- en
- et
- tl
- fi
- fr
- ff
- gl
- lg
- ka
- de
- el
- gu
- ha
- he
- hi
- hu
- is
- ig
- id
- ga
- it
- ja
- jv
- kea
- kam
- kn
- kk
- km
- ko
- ky
- lo
- lv
- ln
- lt
- luo
- lb
- mk
- ms
- ml
- mt
- mi
- mr
- mn
- ne
- ns
- 'no'
- ny
- oc
- or
- om
- ps
- fa
- pl
- pt
- pa
- ro
- ru
- sr
- sn
- sd
- sk
- sl
- so
- ku
- es
- sw
- sv
- tg
- ta
- te
- th
- tr
- uk
- umb
- ur
- uz
- vi
- cy
- wo
- xh
- yo
- zu
license:
- cc-by-sa-4.0
multilinguality:
- multilingual
- translation
size_categories:
- unknown
source_datasets:
- extended|flores
task_categories:
- text-generation
- translation
task_ids: []
paperswithcode_id: flores
pretty_name: flores101
tags:
- conditional-text-generation
---
# Dataset Card for Flores 101
## Table of Contents
- [Dataset Card for Flores 101](#dataset-card-for-flores-101)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Home:** [WMT](http://www.statmt.org/wmt21/large-scale-multilingual-translation-task.html)
- **Repository:** [Github](https://github.com/facebookresearch/flores)
- **Blogpost:** [FAIR](https://ai.facebook.com/blog/the-flores-101-data-set-helping-build-better-translation-systems-around-the-world)
- **Paper:** [Arxiv](https://arxiv.org/abs/2106.03193)
- **Point of Contact:** [flores@fb.com](mailto:flores@fb.com)
- **Leaderboard** [Dynabench](https://dynabench.org/flores/Flores%20MT%20Evaluation%20(FULL))
### Dataset Summary
FLORES is a benchmark dataset for machine translation between English and low-resource languages.
Abstract from the original paper:
> One of the biggest challenges hindering progress in low-resource and multilingual machine translation is the lack of good evaluation benchmarks. Current evaluation benchmarks either lack good coverage of low-resource languages, consider only restricted domains, or are low quality because they are constructed using semi-automatic procedures. In this work, we introduce the FLORES evaluation benchmark, consisting of 3001 sentences extracted from English Wikipedia and covering a variety of different topics and domains. These sentences have been translated in 101 languages by professional translators through a carefully controlled process. The resulting dataset enables better assessment of model quality on the long tail of low-resource languages, including the evaluation of many-to-many multilingual translation systems, as all translations are multilingually aligned. By publicly releasing such a high-quality and high-coverage dataset, we hope to foster progress in the machine translation community and beyond.
**Disclaimer**: *The Flores-101 dataset is hosted by Facebook and licensed under the [Creative Commons Attribution-ShareAlike 4.0 International License](https://creativecommons.org/licenses/by-sa/4.0/).*
### Supported Tasks and Leaderboards
#### Multilingual Machine Translation
Refer to the [Dynabench leaderboard](https://dynabench.org/flores/Flores%20MT%20Evaluation%20(FULL)) for additional details on model evaluation on FLORES-101 in the context of the WMT2021 shared task on [Large-Scale Multilingual Machine Translation](http://www.statmt.org/wmt21/large-scale-multilingual-translation-task.html).
### Languages
The dataset contains parallel sentences for 101 languages, as mentioned in the original [Github](https://github.com/facebookresearch/flores/blob/master/README.md) page for the project. Languages are identified with the ISO 639-3 code (e.g. `eng`, `fra`, `rus`) as in the original dataset.
**New:** Use the configuration `all` to access the full set of parallel sentences for all the available languages in a single command.
## Dataset Structure
### Data Instances
A sample from the `dev` split for the Russian language (`rus` config) is provided below. All configurations have the same structure, and all sentences are aligned across configurations and splits.
```python
{
'id': 1,
'sentence': 'В понедельник ученые из Медицинской школы Стэнфордского университета объявили об изобретении нового диагностического инструмента, который может сортировать клетки по их типу; это маленький чип, который можно напечатать, используя стандартный струйный принтер примерно за 1 цент США.',
'URL': 'https://en.wikinews.org/wiki/Scientists_say_new_medical_diagnostic_chip_can_sort_cells_anywhere_with_an_inkjet',
'domain': 'wikinews',
'topic': 'health',
'has_image': 0,
'has_hyperlink': 0
}
```
The text is provided as in the original dataset, without further preprocessing or tokenization.
### Data Fields
- `id`: Row number for the data entry, starting at 1.
- `sentence`: The full sentence in the specific language.
- `URL`: The URL for the English article from which the sentence was extracted.
- `domain`: The domain of the sentence.
- `topic`: The topic of the sentence.
- `has_image`: Whether the original article contains an image.
- `has_hyperlink`: Whether the sentence contains a hyperlink.
### Data Splits
| config| `dev`| `devtest`|
|-----------------:|-----:|---------:|
|all configurations| 997| 1012|
### Dataset Creation
Please refer to the original article [The FLORES-101 Evaluation Benchmark for Low-Resource and Multilingual Machine Translation](https://arxiv.org/abs/2106.03193) for additional information on dataset creation.
## Additional Information
### Dataset Curators
The original authors of FLORES-101 are the curators of the original dataset. For problems or updates on this 🤗 Datasets version, please contact [gabriele.sarti996@gmail.com](mailto:gabriele.sarti996@gmail.com).
### Licensing Information
Licensed with Creative Commons Attribution Share Alike 4.0. License available [here](https://creativecommons.org/licenses/by-sa/4.0/).
### Citation Information
Please cite the authors if you use these corpora in your work:
```bibtex
@inproceedings{flores101,
title={The FLORES-101 Evaluation Benchmark for Low-Resource and Multilingual Machine Translation},
author={Goyal, Naman and Gao, Cynthia and Chaudhary, Vishrav and Chen, Peng-Jen and Wenzek, Guillaume and Ju, Da and Krishnan, Sanjana and Ranzato, Marc'Aurelio and Guzm\'{a}n, Francisco and Fan, Angela},
journal={arXiv preprint arXiv:2106.03193},
year={2021}
}
``` |
open-llm-leaderboard/details_lgaalves__gpt2_open-platypus | ---
pretty_name: Evaluation run of lgaalves/gpt2_open-platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/gpt2_open-platypus](https://huggingface.co/lgaalves/gpt2_open-platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__gpt2_open-platypus\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T13:45:26.230063](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_open-platypus/blob/main/results_2023-10-15T13-45-26.230063.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.00037786091964607695,\n \"f1\": 0.04636010906040263,\n\
\ \"f1_stderr\": 0.0012972722820894797,\n \"acc\": 0.25726959447047076,\n\
\ \"acc_stderr\": 0.007559748871273466\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964607695,\n\
\ \"f1\": 0.04636010906040263,\n \"f1_stderr\": 0.0012972722820894797\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \
\ \"acc_stderr\": 0.0010717793485492632\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5130228887134964,\n \"acc_stderr\": 0.01404771839399767\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lgaalves/gpt2_open-platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|arc:challenge|25_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T13_45_26.230063
path:
- '**/details_harness|drop|3_2023-10-15T13-45-26.230063.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T13-45-26.230063.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T13_45_26.230063
path:
- '**/details_harness|gsm8k|5_2023-10-15T13-45-26.230063.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T13-45-26.230063.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hellaswag|10_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T17:11:08.445217.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T17:11:08.445217.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T17:11:08.445217.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T13_45_26.230063
path:
- '**/details_harness|winogrande|5_2023-10-15T13-45-26.230063.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T13-45-26.230063.parquet'
- config_name: results
data_files:
- split: 2023_08_31T17_11_08.445217
path:
- results_2023-08-31T17:11:08.445217.parquet
- split: 2023_10_15T13_45_26.230063
path:
- results_2023-10-15T13-45-26.230063.parquet
- split: latest
path:
- results_2023-10-15T13-45-26.230063.parquet
---
# Dataset Card for Evaluation run of lgaalves/gpt2_open-platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/gpt2_open-platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/gpt2_open-platypus](https://huggingface.co/lgaalves/gpt2_open-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2_open-platypus",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-15T13:45:26.230063](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_open-platypus/blob/main/results_2023-10-15T13-45-26.230063.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its task-specific results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964607695,
"f1": 0.04636010906040263,
"f1_stderr": 0.0012972722820894797,
"acc": 0.25726959447047076,
"acc_stderr": 0.007559748871273466
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964607695,
"f1": 0.04636010906040263,
"f1_stderr": 0.0012972722820894797
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492632
},
"harness|winogrande|5": {
"acc": 0.5130228887134964,
"acc_stderr": 0.01404771839399767
}
}
```
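Since the aggregated results above are plain JSON, they can be inspected directly once parsed. A minimal sketch (using only values copied from the block above; the variable names are illustrative) that picks out the per-task accuracies:

```python
import json

# Excerpt of the "latest results" JSON shown above (accuracy tasks only).
results = json.loads("""
{
  "harness|gsm8k|5": {"acc": 0.001516300227445034, "acc_stderr": 0.0010717793485492632},
  "harness|winogrande|5": {"acc": 0.5130228887134964, "acc_stderr": 0.01404771839399767}
}
""")

# Collect accuracy per task, skipping any entries without an "acc" field.
accs = {task: vals["acc"] for task, vals in results.items() if "acc" in vals}

# The task with the highest accuracy in this run.
best_task = max(accs, key=accs.get)
print(best_task)  # harness|winogrande|5
```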
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
wolinski/skladnica_demo | ---
license: cc-by-4.0
---
|
yuvalkirstain/pexel_people | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: generated_caption
dtype: string
splits:
- name: train
num_bytes: 5374411376.0
num_examples: 15994
download_size: 3908548281
dataset_size: 5374411376.0
---
# Dataset Card for "pexel_people"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
smangrul/ad-copy-generation | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 445199.82471516216
num_examples: 1000
- name: test
num_bytes: 62773.17528483786
num_examples: 141
download_size: 194198
dataset_size: 507973.0
---
# Dataset Card for "ad-copy-generation"
Formatted the dataset https://huggingface.co/datasets/jaykin01/advertisement-copy to follow the Llama V2 chat template for instruction tuning.
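The Llama V2 chat template mentioned above wraps each turn in `[INST] ... [/INST]` markers, with an optional `<<SYS>>` block for the system prompt. A minimal sketch of that formatting (the helper name and sample strings are illustrative assumptions, not taken from this dataset):

```python
def to_llama2_chat(system: str, user: str, assistant: str) -> str:
    """Format a single-turn example in the Llama V2 chat style."""
    return (
        f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n"
        f"{user} [/INST] {assistant} </s>"
    )

# Hypothetical ad-copy example, just to show the resulting structure.
sample = to_llama2_chat(
    "You are a marketing copywriter.",
    "Write ad copy for a reusable water bottle.",
    "Stay hydrated, stay green.",
)
print(sample)
```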
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
domserrea/ebay_productlisting | ---
license: cc
---
|
Augusto777/prueba-dmae | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': avanzada
'1': leve
'2': moderada
'3': no dmae
splits:
- name: train
num_bytes: 8067683.0
num_examples: 986
- name: test
num_bytes: 21587702.0
num_examples: 60
- name: validation
num_bytes: 24670569.0
num_examples: 60
download_size: 53942285
dataset_size: 54325954.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
benj3037/neuro_patents_benj | ---
dataset_info:
features:
- name: appln_id
dtype: int64
- name: appln_filing_date
dtype: string
- name: docdb_family_id
dtype: int64
- name: granted
dtype: string
- name: appln_abstract
dtype: string
- name: appln_abstract_lg
dtype: string
- name: appln_title
dtype: string
- name: applt_coun
dtype: string
- name: invt_coun
dtype: string
- name: cpc
dtype: string
- name: ipc
sequence: string
- name: __index_level_0__
dtype: int64
- name: input
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 13214.4
num_examples: 6
download_size: 29577
dataset_size: 13214.4
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_saishf__Top-Western-Maid-7B | ---
pretty_name: Evaluation run of saishf/Top-Western-Maid-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [saishf/Top-Western-Maid-7B](https://huggingface.co/saishf/Top-Western-Maid-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saishf__Top-Western-Maid-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T11:07:35.441841](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Top-Western-Maid-7B/blob/main/results_2024-02-13T11-07-35.441841.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6506958953045886,\n\
\ \"acc_stderr\": 0.03211223802352257,\n \"acc_norm\": 0.6509594125199996,\n\
\ \"acc_norm_stderr\": 0.032774521752530913,\n \"mc1\": 0.42472460220318237,\n\
\ \"mc1_stderr\": 0.017304000957167477,\n \"mc2\": 0.5879284610844989,\n\
\ \"mc2_stderr\": 0.015340978033780782\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205763,\n\
\ \"acc_norm\": 0.6936860068259386,\n \"acc_norm_stderr\": 0.013470584417276513\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6974706233817964,\n\
\ \"acc_stderr\": 0.004584144014654942,\n \"acc_norm\": 0.8740290778729337,\n\
\ \"acc_norm_stderr\": 0.003311384498158642\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n\
\ \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.45098039215686275,\n\
\ \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n\
\ \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n\
\ \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5350877192982456,\n \"acc_stderr\": 0.046920083813689104,\n\
\ \"acc_norm\": 0.5350877192982456,\n \"acc_norm_stderr\": 0.046920083813689104\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\"\
: 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.025424835086924,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.025424835086924\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.04451807959055328,\n\
\ \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.04451807959055328\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n\
\ \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n\
\ \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n\
\ \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.02389187954195961,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.02389187954195961\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.012740853872949832,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.012740853872949832\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42472460220318237,\n\
\ \"mc1_stderr\": 0.017304000957167477,\n \"mc2\": 0.5879284610844989,\n\
\ \"mc2_stderr\": 0.015340978033780782\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6595905989385898,\n \
\ \"acc_stderr\": 0.013052097103299104\n }\n}\n```"
repo_url: https://huggingface.co/saishf/Top-Western-Maid-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|arc:challenge|25_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|gsm8k|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hellaswag|10_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T11-07-35.441841.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T11-07-35.441841.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- '**/details_harness|winogrande|5_2024-02-13T11-07-35.441841.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T11-07-35.441841.parquet'
- config_name: results
data_files:
- split: 2024_02_13T11_07_35.441841
path:
- results_2024-02-13T11-07-35.441841.parquet
- split: latest
path:
- results_2024-02-13T11-07-35.441841.parquet
---
# Dataset Card for Evaluation run of saishf/Top-Western-Maid-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saishf/Top-Western-Maid-7B](https://huggingface.co/saishf/Top-Western-Maid-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
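The configuration names above map mechanically onto their parquet globs. A minimal sketch of the naming scheme, reconstructed from the YAML in this card (the helper `details_glob` is illustrative, not an official API):

```python
# Sketch of the path scheme used by the data_files globs above: each
# hendrycksTest config points at a parquet glob built from the subtask
# name, the few-shot count, and the run timestamp.
def details_glob(task: str, n_shot: int, timestamp: str) -> str:
    return f"**/details_harness|hendrycksTest-{task}|{n_shot}_{timestamp}.parquet"

pattern = details_glob("human_aging", 5, "2024-02-13T11-07-35.441841")
print(pattern)
```

This reproduces, for example, the glob listed under `harness_hendrycksTest_human_aging_5`.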
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saishf__Top-Western-Maid-7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-13T11:07:35.441841](https://huggingface.co/datasets/open-llm-leaderboard/details_saishf__Top-Western-Maid-7B/blob/main/results_2024-02-13T11-07-35.441841.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6506958953045886,
"acc_stderr": 0.03211223802352257,
"acc_norm": 0.6509594125199996,
"acc_norm_stderr": 0.032774521752530913,
"mc1": 0.42472460220318237,
"mc1_stderr": 0.017304000957167477,
"mc2": 0.5879284610844989,
"mc2_stderr": 0.015340978033780782
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205763,
"acc_norm": 0.6936860068259386,
"acc_norm_stderr": 0.013470584417276513
},
"harness|hellaswag|10": {
"acc": 0.6974706233817964,
"acc_stderr": 0.004584144014654942,
"acc_norm": 0.8740290778729337,
"acc_norm_stderr": 0.003311384498158642
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608303,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.02389187954195961,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.02389187954195961
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.012740853872949832,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.012740853872949832
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42472460220318237,
"mc1_stderr": 0.017304000957167477,
"mc2": 0.5879284610844989,
"mc2_stderr": 0.015340978033780782
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.6595905989385898,
"acc_stderr": 0.013052097103299104
}
}
```
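Per-task scores in this dictionary can be aggregated directly. As a sketch, a macro-average accuracy over the MMLU (hendrycksTest) subtasks can be computed as follows; the excerpt copies three scores from the results above, while a full computation would iterate over all subtasks in the loaded results file:

```python
# Sketch: compute a macro-average accuracy over hendrycksTest (MMLU)
# subtasks from a results dictionary shaped like the JSON above. The
# excerpt below copies three subtask scores from this card; a real run
# would load the full results_*.json or parquet file instead.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6842105263157895},
}

mmlu_scores = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
macro_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"macro-average acc over {len(mmlu_scores)} subtasks: {macro_avg:.4f}")
```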
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Leyhtalas/Akshan | ---
license: openrail
---
|
Tngarg/russian_train | ---
dataset_info:
features:
- name: sentiment
dtype: string
- name: tweet
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 315825
num_examples: 1040
download_size: 174896
dataset_size: 315825
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "russian_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Microphone_Collecting_Radio_Frequency_Noise_Data | ---
---
# Dataset Card for Nexdata/Microphone_Collecting_Radio_Frequency_Noise_Data
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/34?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The data was collected in 66 rooms, with 2-4 recording points in each room. According to the relative position of the sound source and the recording point, 2-5 sets of data were collected at each point, totaling 20 hours of valid audio. The recordings cover a wide range of conditions and can be used for smart-home product development.
For more details, please refer to the link: https://www.nexdata.ai/datasets/34?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, noisy-speech-recognition: the dataset can be used to train a model for Automatic Speech Recognition (ASR), particularly under noisy conditions.
### Languages
Noise data
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |
Kisu-2003/dataset_ai_earth_hackthon | ---
license: apache-2.0
task_categories:
- table-question-answering
tags:
- climate
size_categories:
- n<1K
---
The dataset contains circular economy business ideas that come in problem-solution pairs. Participants were asked about the problem their solution is meant to solve and to describe the solution in their own words.
gayanin/babylon-native-v8-vocab-noised | ---
dataset_info:
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 693080
num_examples: 3893
- name: test
num_bytes: 74246
num_examples: 487
- name: validation
num_bytes: 79131
num_examples: 487
download_size: 483474
dataset_size: 846457
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
mickume/alt_fantasy | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 653833080
num_examples: 3488817
download_size: 402691207
dataset_size: 653833080
---
# Dataset Card for "alt_fantasy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UnderstandLing/oasst1_nl_threads_dpo | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 31354831
num_examples: 23888
- name: validation
num_bytes: 1711827
num_examples: 1234
download_size: 13268460
dataset_size: 33066658
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Arnaldo34/Myvoice4 | ---
license: openrail
---
|
CyberHarem/p22_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of p22/P22/P22 (Girls' Frontline)
This is the dataset of p22/P22/P22 (Girls' Frontline), containing 25 images and their tags.
The core tags of this character are `blue_eyes, short_hair, bangs, breasts, hair_between_eyes, black_hair, earrings, grey_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 25.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 16.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 29.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 23.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 39.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/p22_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | looking_at_viewer, solo, 1girl, blue_jacket, cleavage, navel, black_shorts, black_thighhighs, blush, checkered_flag, fingerless_gloves, full_body, highleg_panties, race_queen, short_shorts, bikini, headset, high_heels, official_alternate_costume, sitting, thigh_boots, black_gloves, blue_panties, collarbone, cropped_jacket, holding_flag, open_clothes, smile |
| 1 | 18 |  |  |  |  |  | 1girl, solo, looking_at_viewer, jewelry, smile, bare_shoulders, jacket, closed_mouth, sleeveless, black_nails, handgun, holding_gun, long_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | solo | 1girl | blue_jacket | cleavage | navel | black_shorts | black_thighhighs | blush | checkered_flag | fingerless_gloves | full_body | highleg_panties | race_queen | short_shorts | bikini | headset | high_heels | official_alternate_costume | sitting | thigh_boots | black_gloves | blue_panties | collarbone | cropped_jacket | holding_flag | open_clothes | smile | jewelry | bare_shoulders | jacket | closed_mouth | sleeveless | black_nails | handgun | holding_gun | long_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:-------|:--------|:--------------|:-----------|:--------|:---------------|:-------------------|:--------|:-----------------|:--------------------|:------------|:------------------|:-------------|:---------------|:---------|:----------|:-------------|:-----------------------------|:----------|:--------------|:---------------|:---------------|:-------------|:-----------------|:---------------|:---------------|:--------|:----------|:-----------------|:---------|:---------------|:-------------|:--------------|:----------|:--------------|:---------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
irds/mr-tydi_ru | ---
pretty_name: '`mr-tydi/ru`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `mr-tydi/ru`
The `mr-tydi/ru` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/ru).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=9,597,504
- `queries` (i.e., topics); count=7,763
- `qrels` (relevance assessments); count=7,909
This dataset is used by: [`mr-tydi_ru_dev`](https://huggingface.co/datasets/irds/mr-tydi_ru_dev), [`mr-tydi_ru_test`](https://huggingface.co/datasets/irds/mr-tydi_ru_test), [`mr-tydi_ru_train`](https://huggingface.co/datasets/irds/mr-tydi_ru_train)
## Usage
```python
from datasets import load_dataset

docs = load_dataset('irds/mr-tydi_ru', 'docs')
for record in docs:
    record  # {'doc_id': ..., 'text': ...}

queries = load_dataset('irds/mr-tydi_ru', 'queries')
for record in queries:
    record  # {'query_id': ..., 'text': ...}

qrels = load_dataset('irds/mr-tydi_ru', 'qrels')
for record in qrels:
    record  # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
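When scoring a retrieval run against these assessments, it is convenient to index the qrels by query. A minimal sketch, assuming the `{'query_id', 'doc_id', 'relevance', 'iteration'}` record layout shown above (the sample records here are invented for illustration):

```python
from collections import defaultdict

# Hypothetical qrels records, matching the record layout documented above
qrels_records = [
    {'query_id': 'q1', 'doc_id': 'd1', 'relevance': 1, 'iteration': '0'},
    {'query_id': 'q1', 'doc_id': 'd2', 'relevance': 0, 'iteration': '0'},
    {'query_id': 'q2', 'doc_id': 'd3', 'relevance': 1, 'iteration': '0'},
]

def build_qrels(records):
    """Map query_id -> {doc_id: relevance} for fast lookup during evaluation."""
    qrels = defaultdict(dict)
    for rec in records:
        qrels[rec['query_id']][rec['doc_id']] = rec['relevance']
    return dict(qrels)

qrels = build_qrels(qrels_records)
print(qrels['q1'])  # {'d1': 1, 'd2': 0}
```

The same loop works unchanged over the records yielded by `load_dataset('irds/mr-tydi_ru', 'qrels')`.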
## Citation Information
```
@article{Zhang2021MrTyDi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
@article{Clark2020TyDiQa,
title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year={2020},
journal={Transactions of the Association for Computational Linguistics}
}
```
|
nlpso/m1_fine_tuning_ocr_ptrn_cmbert_iob2 | ---
language:
- fr
multilinguality:
- monolingual
task_categories:
- token-classification
---
# m1_fine_tuning_ocr_ptrn_cmbert_iob2
## Introduction
This dataset was used to fine-tune [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained) for a **nested NER task** using the Independent NER layers approach [M1].
It contains Paris trade directory entries from the 19th century.
## Dataset parameters
* Approach: M1
* Dataset type: noisy (Pero OCR)
* Tokenizer: [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained)
* Tagging format: IOB2
* Counts:
  * Train: 6084
  * Dev: 676
  * Test: 1685
* Associated fine-tuned models:
  * Level 1: [nlpso/m1_ind_layers_ocr_ptrn_cmbert_iob2_level_1](https://huggingface.co/nlpso/m1_ind_layers_ocr_ptrn_cmbert_iob2_level_1)
  * Level 2: [nlpso/m1_ind_layers_ocr_ptrn_cmbert_iob2_level_2](https://huggingface.co/nlpso/m1_ind_layers_ocr_ptrn_cmbert_iob2_level_2)
## Entity types
Abbreviation|Entity group (level)|Description
-|-|-
O |1 & 2|Outside of a named entity
PER |1|Person or company name
ACT |1 & 2|Person or company professional activity
TITREH |2|Military or civil distinction
DESC |1|Entry full description
TITREP |2|Professional reward
SPAT |1|Address
LOC |2|Street name
CARDINAL |2|Street number
FT |2|Geographical feature
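To recover entity spans from the IOB2 tags at either level, a small decoder suffices. A minimal sketch (tag names are taken from the table above; the sample directory entry is invented, not drawn from the dataset):

```python
def iob2_to_spans(tokens, tags):
    """Decode an IOB2 tag sequence into (label, start, end) spans, end exclusive."""
    spans, start, label = [], None, None
    for i, tag in enumerate(tags):
        # close the open span on O, on a new B-, or on an I- with a different label
        if tag == 'O' or tag.startswith('B-') or (tag.startswith('I-') and tag[2:] != label):
            if label is not None:
                spans.append((label, start, i))
                start, label = None, None
        if tag.startswith('B-'):
            start, label = i, tag[2:]
        elif tag.startswith('I-') and label is None:
            # tolerate I- without a preceding B- (can occur in noisy OCR output)
            start, label = i, tag[2:]
    if label is not None:
        spans.append((label, start, len(tags)))
    return spans

# Invented level-1 example: "Dupont, tailleur, rue Vivienne 12"
tokens = ['Dupont', ',', 'tailleur', ',', 'rue', 'Vivienne', '12']
tags   = ['B-PER', 'O', 'B-ACT', 'O', 'B-SPAT', 'I-SPAT', 'I-SPAT']
print(iob2_to_spans(tokens, tags))
# [('PER', 0, 1), ('ACT', 2, 3), ('SPAT', 4, 7)]
```

Running the decoder once per annotation level yields the two nested layers of the M1 approach.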
## How to use this dataset
```python
from datasets import load_dataset
train_dev_test = load_dataset("nlpso/m1_fine_tuning_ocr_ptrn_cmbert_iob2")
```
|
open-llm-leaderboard/details_Nekochu__Llama-2-13B-German-ORPO | ---
pretty_name: Evaluation run of Nekochu/Llama-2-13B-German-ORPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Nekochu/Llama-2-13B-German-ORPO](https://huggingface.co/Nekochu/Llama-2-13B-German-ORPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Nekochu__Llama-2-13B-German-ORPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T13:20:29.082617](https://huggingface.co/datasets/open-llm-leaderboard/details_Nekochu__Llama-2-13B-German-ORPO/blob/main/results_2024-04-15T13-20-29.082617.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.532542366507223,\n\
\ \"acc_stderr\": 0.03382149520120546,\n \"acc_norm\": 0.5390973221013382,\n\
\ \"acc_norm_stderr\": 0.03456643230807577,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4243655679693484,\n\
\ \"mc2_stderr\": 0.015193280285346479\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.507679180887372,\n \"acc_stderr\": 0.014609667440892567,\n\
\ \"acc_norm\": 0.5477815699658704,\n \"acc_norm_stderr\": 0.014544519880633832\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.599681338378809,\n\
\ \"acc_stderr\": 0.004889615413144191,\n \"acc_norm\": 0.790479984066919,\n\
\ \"acc_norm_stderr\": 0.004061343422198776\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5622641509433962,\n \"acc_stderr\": 0.030533338430467516,\n\
\ \"acc_norm\": 0.5622641509433962,\n \"acc_norm_stderr\": 0.030533338430467516\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.024326310529149135,\n \"\
acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149135\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.041049472699033945,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.041049472699033945\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.635483870967742,\n \"acc_stderr\": 0.02737987122994325,\n \"acc_norm\"\
: 0.635483870967742,\n \"acc_norm_stderr\": 0.02737987122994325\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4482758620689655,\n\
\ \"acc_stderr\": 0.03499113137676744,\n \"acc_norm\": 0.4482758620689655,\n\
\ \"acc_norm_stderr\": 0.03499113137676744\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391245,\n \
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391245\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070644,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070644\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.03027690994517826,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.03027690994517826\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.025294608023986472,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.025294608023986472\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230172,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230172\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5084033613445378,\n \"acc_stderr\": 0.0324739027656967,\n \
\ \"acc_norm\": 0.5084033613445378,\n \"acc_norm_stderr\": 0.0324739027656967\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6990825688073394,\n \"acc_stderr\": 0.019664751366802114,\n \"\
acc_norm\": 0.6990825688073394,\n \"acc_norm_stderr\": 0.019664751366802114\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.032568505702936484,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.032568505702936484\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7156862745098039,\n \"acc_stderr\": 0.031660096793998116,\n \"\
acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.031660096793998116\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899616,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404033,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404033\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395953,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395953\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124658,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124658\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28268156424581004,\n\
\ \"acc_stderr\": 0.015060381730018111,\n \"acc_norm\": 0.28268156424581004,\n\
\ \"acc_norm_stderr\": 0.015060381730018111\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631452,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631452\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484627,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484627\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.027201117666925654,\n\
\ \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.027201117666925654\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596147,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36897001303780963,\n\
\ \"acc_stderr\": 0.012323936650174859,\n \"acc_norm\": 0.36897001303780963,\n\
\ \"acc_norm_stderr\": 0.012323936650174859\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.0302114796091216,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.0302114796091216\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5441176470588235,\n \"acc_stderr\": 0.020148939420415745,\n \
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.020148939420415745\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.031001209039894843,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.031001209039894843\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4243655679693484,\n\
\ \"mc2_stderr\": 0.015193280285346479\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7363851617995264,\n \"acc_stderr\": 0.012382849299658463\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1728582259287339,\n \
\ \"acc_stderr\": 0.01041543224620058\n }\n}\n```"
repo_url: https://huggingface.co/Nekochu/Llama-2-13B-German-ORPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|arc:challenge|25_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|gsm8k|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hellaswag|10_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T13-20-29.082617.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T13-20-29.082617.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- '**/details_harness|winogrande|5_2024-04-15T13-20-29.082617.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T13-20-29.082617.parquet'
- config_name: results
data_files:
- split: 2024_04_15T13_20_29.082617
path:
- results_2024-04-15T13-20-29.082617.parquet
- split: latest
path:
- results_2024-04-15T13-20-29.082617.parquet
---
# Dataset Card for Evaluation run of Nekochu/Llama-2-13B-German-ORPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Nekochu/Llama-2-13B-German-ORPO](https://huggingface.co/Nekochu/Llama-2-13B-German-ORPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Nekochu__Llama-2-13B-German-ORPO",
"harness_winogrande_5",
	split="latest")
```
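The per-task config names follow the pattern visible in the YAML above. As a convenience, they can be constructed programmatically; this is a minimal sketch under the assumption that the `harness_hendrycksTest_<task>_<n_shots>` naming scheme holds for every MMLU task listed:

```python
# Sketch: build the config name for a given MMLU task, assuming the
# "harness_hendrycksTest_<task>_<n_shots>" scheme seen in the configs above.
def mmlu_config_name(task: str, n_shots: int = 5) -> str:
    return f"harness_hendrycksTest_{task}_{n_shots}"

print(mmlu_config_name("world_religions"))  # harness_hendrycksTest_world_religions_5
```

The resulting string can be passed as the second argument to `load_dataset` in the snippet above.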
## Latest results
These are the [latest results from run 2024-04-15T13:20:29.082617](https://huggingface.co/datasets/open-llm-leaderboard/details_Nekochu__Llama-2-13B-German-ORPO/blob/main/results_2024-04-15T13-20-29.082617.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.532542366507223,
"acc_stderr": 0.03382149520120546,
"acc_norm": 0.5390973221013382,
"acc_norm_stderr": 0.03456643230807577,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4243655679693484,
"mc2_stderr": 0.015193280285346479
},
"harness|arc:challenge|25": {
"acc": 0.507679180887372,
"acc_stderr": 0.014609667440892567,
"acc_norm": 0.5477815699658704,
"acc_norm_stderr": 0.014544519880633832
},
"harness|hellaswag|10": {
"acc": 0.599681338378809,
"acc_stderr": 0.004889615413144191,
"acc_norm": 0.790479984066919,
"acc_norm_stderr": 0.004061343422198776
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5622641509433962,
"acc_stderr": 0.030533338430467516,
"acc_norm": 0.5622641509433962,
"acc_norm_stderr": 0.030533338430467516
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149135,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149135
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.041049472699033945,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.041049472699033945
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.02737987122994325,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.02737987122994325
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391245,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391245
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.03027690994517826,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.03027690994517826
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.025294608023986472,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.025294608023986472
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230172,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230172
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5084033613445378,
"acc_stderr": 0.0324739027656967,
"acc_norm": 0.5084033613445378,
"acc_norm_stderr": 0.0324739027656967
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6990825688073394,
"acc_stderr": 0.019664751366802114,
"acc_norm": 0.6990825688073394,
"acc_norm_stderr": 0.019664751366802114
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.032568505702936484,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.032568505702936484
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.031660096793998116,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.031660096793998116
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899616,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404033,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404033
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395953,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395953
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124658,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124658
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28268156424581004,
"acc_stderr": 0.015060381730018111,
"acc_norm": 0.28268156424581004,
"acc_norm_stderr": 0.015060381730018111
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631452,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484627,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.027201117666925654,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.027201117666925654
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596147,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36897001303780963,
"acc_stderr": 0.012323936650174859,
"acc_norm": 0.36897001303780963,
"acc_norm_stderr": 0.012323936650174859
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.020148939420415745,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.020148939420415745
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.031001209039894843,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.031001209039894843
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4243655679693484,
"mc2_stderr": 0.015193280285346479
},
"harness|winogrande|5": {
"acc": 0.7363851617995264,
"acc_stderr": 0.012382849299658463
},
"harness|gsm8k|5": {
"acc": 0.1728582259287339,
"acc_stderr": 0.01041543224620058
}
}
```
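For quick sanity checks, per-task scores like those above can be aggregated offline with the standard library. A minimal sketch, using a small subset of the `acc_norm` values shown (not the full table):

```python
import statistics

# A few per-task acc_norm values copied from the results above
# (subset chosen for illustration only, not the full MMLU average).
scores = {
    "hendrycksTest-abstract_algebra": 0.35,
    "hendrycksTest-anatomy": 0.4740740740740741,
    "hendrycksTest-astronomy": 0.5460526315789473,
}

mean_acc = statistics.mean(scores.values())
print(round(mean_acc, 4))  # 0.4567
```

The leaderboard itself computes these aggregates from the "results" config, so this is only useful for spot-checking a handful of tasks locally.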
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
japanese-asr/whisper_transcriptions.reazonspeech.small | ---
dataset_info:
config_name: small
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 7040851288.0
num_examples: 62047
download_size: 6998026638
dataset_size: 7040851288.0
configs:
- config_name: small
data_files:
- split: train
path: small/train-*
---
|
AIHowto/Chilloutmix_woman_regset1 | ---
license: creativeml-openrail-m
---
|
liuyanchen1015/MULTI_VALUE_cola_definite_for_indefinite_articles | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 13564
num_examples: 162
- name: test
num_bytes: 10626
num_examples: 140
- name: train
num_bytes: 100355
num_examples: 1280
download_size: 61176
dataset_size: 124545
---
# Dataset Card for "MULTI_VALUE_cola_definite_for_indefinite_articles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ledoc/blacko | ---
license: apache-2.0
---
|
Goorm-AI-04/RCS_Image_Stratified_Train_Test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: rcs_image
dtype: image
- name: drone_type
dtype: string
- name: frequency
dtype: int64
- name: label
dtype:
class_label:
names:
'0': 0
'1': 1
'2': 2
'3': 3
'4': 4
'5': 5
'6': 6
'7': 7
'8': 8
'9': 9
'10': 10
'11': 11
'12': 12
'13': 13
'14': 14
'15': 15
splits:
- name: train
num_bytes: 24972888.0
num_examples: 192
- name: test
num_bytes: 6243222.0
num_examples: 48
download_size: 31218865
dataset_size: 31216110.0
---
# Dataset Card for "RCS_Image_Stratified_Train_Test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kenhktsui/basemath | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: Rationale
dtype: string
- name: annotated_formula
dtype: string
- name: linear_formula
dtype: string
splits:
- name: train
num_bytes: 57454541
num_examples: 100000
download_size: 28978379
dataset_size: 57454541
---
# Dataset Card for "basemath"
The objective of basemath is to train the mathematical capability of language models in a diverse setting.
The dataset is composed of samples from the datasets below:
https://huggingface.co/datasets/math_dataset
https://huggingface.co/datasets/math_qa
https://huggingface.co/datasets/competition_math
https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_math_jsonl
https://huggingface.co/datasets/qwedsacf/grade-school-math-instructions
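The mixing described above can be sketched in plain Python; the per-source pools and draw sizes here are illustrative stand-ins, not the actual recipe used to build basemath:

```python
import random

# Toy in-memory pools standing in for the source datasets listed above
# (real builds would pull from the Hub; contents here are illustrative).
sources = {
    "math_qa": [{"question": "2+2?", "answer": "4"}],
    "grade-school-math": [{"question": "3*3?", "answer": "9"},
                          {"question": "10-4?", "answer": "6"}],
}

random.seed(0)
mixture = [
    dict(item, source=name)  # tag each row with its originating dataset
    for name, pool in sources.items()
    for item in random.sample(pool, k=min(1, len(pool)))  # cap per-source draw
]
print([row["source"] for row in mixture])
```

Keeping a `source` column, as basemath does, lets downstream users filter or re-weight the mixture.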
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OpenLeecher/GPT4-10k | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
size_categories:
- n<1K
---
The goal of this dataset was to siphon as much money as possible from a 20 dollar subscription that I forgot to cancel. Enjoy.
---
100 diverse GPT-4 conversations. Features Coding, Debugging, Storytelling, Spatial Thinking, Logical Thinking, Chemistry, Physics, and a conversation or two about Biology and Law.

 |
semeru/code-code-BugFixingSmall | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: validation
num_bytes: 1582636
num_examples: 5835
- name: train
num_bytes: 12633815
num_examples: 46680
- name: test
num_bytes: 1573020
num_examples: 5835
download_size: 0
dataset_size: 15789471
---
# Dataset Card for "BFsmall_finetuning"
## Reference
<pre><code>@article{Mastropaolo2022TransferLearningForCodeRelatedTasks,
title={Using Transfer Learning for Code-Related Tasks},
author={Mastropaolo, Antonio and Cooper, Nathan and Nader Palacio, David and Scalabrino, Simone and
Poshyvanyk, Denys and Oliveto, Rocco and Bavota, Gabriele},
journal={arXiv preprint arXiv:2206.08574},
year={2022}
}</code></pre> |
freshpearYoon/vr_train_free_18 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 7076719564
num_examples: 10000
download_size: 1262775851
dataset_size: 7076719564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-80c2643d-2334-4a14-9912-449e234f13a2-102 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: autoevaluate/multi-class-classification
metrics: ['matthews_correlation']
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: autoevaluate/multi-class-classification
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
attuneengineering/Human_Genome_Embedding_Collection | ---
license: mit
language:
- en
tags:
- biology
- medical
pretty_name: Human Reference Genome Vector Embeddings
size_categories:
- 10K<n<100K
--- |
hassanraha/multillm-route-instruct | ---
license: apache-2.0
---
|
NobodyExistsOnTheInternet/testgiftedv2zip | ---
license: mit
---
|
ravithejads/fiqa_trans | ---
dataset_info:
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: text_hi
dtype: string
splits:
- name: train
num_bytes: 158459751
num_examples: 57637
download_size: 72304915
dataset_size: 158459751
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/squad_for_gpt_train_1000_100 | ---
dataset_info:
features:
- name: text
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 3564228.0
num_examples: 1000
- name: validation
num_bytes: 371624
num_examples: 100
download_size: 2479909
dataset_size: 3935852.0
---
# Dataset Card for "squad_for_gpt_train_1000_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
316usman/thematic3c | ---
license: bsd
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 10758588
num_examples: 14504
download_size: 3149429
dataset_size: 10758588
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
seedboxai/winogrande_de | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: option1
dtype: string
- name: option2
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 28931
num_examples: 201
download_size: 20696
dataset_size: 28931
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/dollyaug-standardized_cluster_1_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4135748
num_examples: 2015
download_size: 2483217
dataset_size: 4135748
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dollyaug-standardized_cluster_1_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nthakur/miracl-raft-sft-instruct-v0.1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 704280553.6223383
num_examples: 95560
- name: test
num_bytes: 29480140.377661712
num_examples: 4000
download_size: 360967058
dataset_size: 733760694.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
HossainRabby/UpdatedDataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 72929903.08721887
num_examples: 14766
- name: test
num_bytes: 8104968.91278113
num_examples: 1641
download_size: 26912462
dataset_size: 81034872.0
---
# Dataset Card for "hii"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PaulLoisel/mlp_no_cat_dataset | ---
dataset_info:
features:
- name: purchased_products
dtype: int64
- name: review_time_spent
dtype: int64
- name: product_category
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 601.2
num_examples: 3
- name: test
num_bytes: 200.4
num_examples: 1
- name: val
num_bytes: 200.4
num_examples: 1
download_size: 12247
dataset_size: 1002.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
---
# Dataset Card for "mlp_no_cat_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DBQ/Chanel.Product.prices.Italy | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Italy - Chanel - Product-level price list
tags:
- webscraping
- ecommerce
- Chanel
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 777179
num_examples: 1426
download_size: 201787
dataset_size: 777179
---
# Chanel web scraped data
## About the website
Operating within the highly competitive and exclusive **luxury fashion industry** in EMEA, notably in **Italy**, **Chanel** occupies a premium market position as one of the world's most recognized fashion brands. The industry is characterized by high-quality, high-priced goods aimed primarily at affluent consumers, and its shift towards digitalization has become particularly evident in recent years in response to changing consumer habits. This dataset contains **Ecommerce product-list page (PLP) data** on Chanel's offerings in Italy, showcasing the brand's extensive digital presence. By providing insight into Chanel's online strategies, the data underscores the critical role of Ecommerce in Italy's luxury fashion landscape.
## Link to **dataset**
[Italy - Chanel - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Chanel%20Product-prices%20Italy/r/recwA4XA1XVKUBLa6)
|
Purefire/toolChoose | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: text
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 336320
num_examples: 133
download_size: 31566
dataset_size: 336320
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
isp-uv-es/SEN2NAIP | ---
license: cc-by-4.0
---
<center>
<img src="demo/logo.png" width=95%>
</center>
# SEN2NAIP
The increasing demand for high spatial resolution in remote sensing imagery has led to the necessity of super-resolution (SR) algorithms that
convert low-resolution (LR) images into high-resolution (HR) ones. To address this need, we introduce SEN2NAIP, a large remote sensing dataset
designed to support conventional and reference-based SR model training. SEN2NAIP is structured into two components to provide a broad spectrum
of research and application needs. The first component comprises a cross-sensor dataset of 2,851 pairs of LR images from Sentinel-2 L2A and HR
images from the National Agriculture Imagery Program (NAIP). Leveraging this dataset, we developed a degradation model capable of converting NAIP
images to match the characteristics of Sentinel-2 imagery (S2like). Subsequently, this degradation model was utilized to create the second component,
a synthetic dataset comprising 17,657 NAIP and S2like image pairs. With the SEN2NAIP dataset, we aim to provide a valuable resource that facilitates
the exploration of new techniques for enhancing the spatial resolution of Sentinel-2 satellite imagery.
# DOWNLOAD DATASET
```
from huggingface_hub import hf_hub_download
# Download cross-sensor dataset
hf_hub_download(
repo_id="isp-uv-es/SEN2NAIP",
repo_type="dataset",
filename="cross-sensor/cross-sensor.zip"
)
# Download synthetic dataset
for i in range(1, 19):
hf_hub_download(
repo_id="isp-uv-es/SEN2NAIP",
repo_type="dataset",
filename="synthetic/synthetic_%02d.zip" % i
)
```
# REPRODUCIBLE EXAMPLES
## Load cross-sensor dataset
```python
import rioxarray
import torch
DEMO_PATH = "https://huggingface.co/datasets/isp-uv-es/SEN2NAIP/resolve/main/demo/"
cross_sensor_path = DEMO_PATH + "cross-sensor/ROI_0000/"
hr_data = rioxarray.open_rasterio(cross_sensor_path + "hr.tif")
lr_data = rioxarray.open_rasterio(cross_sensor_path + "lr.tif")
hr_torch = torch.from_numpy(hr_data.to_numpy()) / 255
lr_torch = torch.from_numpy(lr_data.to_numpy()) / 10000
```
## Load Synthetic dataset
Available methods: **vae_histogram_matching**, **gamma_multivariate_normal_90**, **gamma_multivariate_normal_75**, **gamma_multivariate_normal_50**,
**gamma_multivariate_normal_25**, **gamma_multivariate_normal_10**.
```python
import opensr_degradation
import rioxarray
import datasets
import requests
import torch
def load_metadata(metadata_path: str) -> dict:
    # Fetch and parse the JSON metadata directly; no temporary file needed.
    response = requests.get(metadata_path)
    response.raise_for_status()
    return response.json()
DEMO_PATH = "https://huggingface.co/datasets/isp-uv-es/SEN2NAIP/resolve/main/demo/"
# Synthetic LR and HR data ------------------------------
synthetic_path = DEMO_PATH + "synthetic/ROI_0001/"
hr_early_data = rioxarray.open_rasterio(synthetic_path + "early/01__m_4506807_nw_19_1_20110818.tif")
hr_early_torch = torch.from_numpy(hr_early_data.to_numpy()) / 255
hr_early_metadata = load_metadata(synthetic_path + "early/metadata.json")
lr_hat, hr_hat = opensr_degradation.main.get_s2like(
image=hr_early_torch,
table=hr_early_metadata["sim_histograms"],
model="gamma_multivariate_normal_50"
)
import matplotlib.pyplot as plt
fig, ax = plt.subplots(1, 3, figsize=(10, 5))
ax[0].imshow(hr_early_torch[[3, 1, 2]].permute(1, 2, 0))
ax[0].set_title("NAIP")
ax[1].imshow(hr_hat[[3, 1, 2]].permute(1, 2, 0)*3)
ax[1].set_title("NAIPhat")
ax[2].imshow(lr_hat[[3, 1, 2]].permute(1, 2, 0)*3)
ax[2].set_title("S2like")
plt.show()
```
<center>
<img src="https://github.com/ESAOpenSR/opensr-degradation/assets/16768318/c88fa16e-bbe7-4072-b518-5ab3b7278893" width=100%>
</center>
# CITATION
TODO! |
DIBT/MPEP_SPANISH | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for MPEP_SPANISH
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("DIBT/MPEP_SPANISH")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("DIBT/MPEP_SPANISH")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| source | Source | text | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| target | Target | text | True | Translate the text. | N/A |
The **suggestions** are human- or machine-generated recommendations for each question that assist the annotator during the annotation process. They are always linked to an existing question and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the value(s) of the suggestion and its metadata, respectively. The possible values are therefore the same as in the table above, but with "-suggestion" appended to the column name and "-suggestion-metadata" appended for the metadata.
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": "165",
"fields": {
"source": "Given the text: An experienced and enthusiastic innovator...you want on your team.\nMargaret Hines is the founder and Principal Consultant of Inspire Marketing, LLC, investing in local businesses, serving the community with business brokerage and marketing consulting. She has an undergraduate degree from Washington University in St. Louis, MO, and an MBA from the University of Wisconsin-Milwaukee.\nMargaret offers consulting in marketing, business sales and turnarounds and franchising. She is also an investor in local businesses.\nPrior to founding Inspire Marketing in 2003, Margaret gained her business acumen, sales and marketing expertise while working at respected Fortune 1000 companies.\nSummarize the background and expertise of Margaret Hines, the founder of Inspire Marketing."
},
"metadata": {
"evolved_from": null,
"kind": "synthetic",
"source": "ultrachat"
},
"responses": [
{
"status": "submitted",
"user_id": "8581ce44-b17e-40a8-81a0-e20b63074c9d",
"values": {
"target": {
"value": "Dado el texto: Una innovadora experimentada y entusiasta... que quieres en tu equipo.\nMargaret Hines es la fundadora y Consultora Principal de Inspire Marketing, LLC, que invierte en negocios locales, sirviendo a la comunidad con consultor\u00eda de negocios y marketing. Ella tiene un t\u00edtulo universitario de la Universidad de Washington en St. Louis, MO, y un MBA de la Universidad de Wisconsin-Milwaukee.\nMargaret ofrece consultor\u00eda en marketing, ventas de negocios, transformaciones de negocios y franquicias. Tambi\u00e9n es inversora en negocios locales.\nAntes de fundar Inspire Marketing en 2003, Margaret adquiri\u00f3 su habilidad para los negocios, experiencia en ventas y marketing mientras trabajaba en respetadas empresas de Fortune 1000.\nResume la formaci\u00f3n y experiencia de Margaret Hines, la fundadora de Inspire Marketing."
}
}
}
],
"suggestions": [
{
"agent": null,
"question_name": "target",
"score": null,
"type": null,
"value": "Dado el texto: Una innovadora experimentada y entusiasta... que quieres en tu equipo.\nMargaret Hines es la fundadora y Consultora Principal de Inspire Marketing, LLC, invirtiendo en negocios locales, sirviendo a la comunidad con consultor\u00eda de negocios y marketing. Ella tiene un t\u00edtulo universitario de la Universidad de Washington en St. Louis, MO, y un MBA de la Universidad de Wisconsin-Milwaukee.\nMargaret ofrece consultor\u00eda en marketing, ventas de negocios, transformaciones de negocios y franquicias. Tambi\u00e9n es inversora en negocios locales.\nAntes de fundar Inspire Marketing en 2003, Margaret adquiri\u00f3 su habilidad para los negocios, experiencia en ventas y marketing mientras trabajaba en respetadas empresas Fortune 1000.\nResumen de la formaci\u00f3n y experiencia de Margaret Hines, la fundadora de Inspire Marketing."
}
],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": "165",
"metadata": "{\"source\": \"ultrachat\", \"kind\": \"synthetic\", \"evolved_from\": null}",
"source": "Given the text: An experienced and enthusiastic innovator...you want on your team.\nMargaret Hines is the founder and Principal Consultant of Inspire Marketing, LLC, investing in local businesses, serving the community with business brokerage and marketing consulting. She has an undergraduate degree from Washington University in St. Louis, MO, and an MBA from the University of Wisconsin-Milwaukee.\nMargaret offers consulting in marketing, business sales and turnarounds and franchising. She is also an investor in local businesses.\nPrior to founding Inspire Marketing in 2003, Margaret gained her business acumen, sales and marketing expertise while working at respected Fortune 1000 companies.\nSummarize the background and expertise of Margaret Hines, the founder of Inspire Marketing.",
"target": [
{
"status": "submitted",
"user_id": "8581ce44-b17e-40a8-81a0-e20b63074c9d",
"value": "Dado el texto: Una innovadora experimentada y entusiasta... que quieres en tu equipo.\nMargaret Hines es la fundadora y Consultora Principal de Inspire Marketing, LLC, que invierte en negocios locales, sirviendo a la comunidad con consultor\u00eda de negocios y marketing. Ella tiene un t\u00edtulo universitario de la Universidad de Washington en St. Louis, MO, y un MBA de la Universidad de Wisconsin-Milwaukee.\nMargaret ofrece consultor\u00eda en marketing, ventas de negocios, transformaciones de negocios y franquicias. Tambi\u00e9n es inversora en negocios locales.\nAntes de fundar Inspire Marketing en 2003, Margaret adquiri\u00f3 su habilidad para los negocios, experiencia en ventas y marketing mientras trabajaba en respetadas empresas de Fortune 1000.\nResume la formaci\u00f3n y experiencia de Margaret Hines, la fundadora de Inspire Marketing."
}
],
"target-suggestion": "Dado el texto: Una innovadora experimentada y entusiasta... que quieres en tu equipo.\nMargaret Hines es la fundadora y Consultora Principal de Inspire Marketing, LLC, invirtiendo en negocios locales, sirviendo a la comunidad con consultor\u00eda de negocios y marketing. Ella tiene un t\u00edtulo universitario de la Universidad de Washington en St. Louis, MO, y un MBA de la Universidad de Wisconsin-Milwaukee.\nMargaret ofrece consultor\u00eda en marketing, ventas de negocios, transformaciones de negocios y franquicias. Tambi\u00e9n es inversora en negocios locales.\nAntes de fundar Inspire Marketing en 2003, Margaret adquiri\u00f3 su habilidad para los negocios, experiencia en ventas y marketing mientras trabajaba en respetadas empresas Fortune 1000.\nResumen de la formaci\u00f3n y experiencia de Margaret Hines, la fundadora de Inspire Marketing.",
"target-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **source** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **target** is of type `text`, and description "Translate the text.".
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **target-suggestion** is of type `text`.
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** An optional field that provides additional context about a dataset record, such as its author, date, or a link to the original source. It can be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** An optional field that provides an external ID for the dataset record, which is useful for linking the record to an external resource such as a database or a file.
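Putting these pieces together, a single record can be sketched as a plain Python dict. This is a hypothetical illustration only: the `validate_record` helper is not part of Argilla, and the field names simply follow the descriptions above.

```python
# Hypothetical sketch of one record in this dataset's layout
# (not an official Argilla API).

def validate_record(record):
    """Check that a record carries the expected keys and types."""
    assert isinstance(record["fields"]["source"], str)   # the text field
    suggestion = record.get("target-suggestion")         # always optional
    if suggestion is not None:
        assert isinstance(suggestion, str)
        meta = record.get("target-suggestion-metadata", {})
        # agent/score/type may all be None when no model produced the suggestion
        assert set(meta) <= {"agent", "score", "type"}
    return True

record = {
    "fields": {"source": "An experienced and enthusiastic innovator..."},
    "target-suggestion": "Una innovadora experimentada y entusiasta...",
    "target-suggestion-metadata": {"agent": None, "score": None, "type": None},
    "metadata": None,       # optional free-form context
    "external_id": None,    # optional link to an external resource
}

print(validate_record(record))  # → True
```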
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
This is a translation dataset. Please translate the text shown in the **source** field.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
cheeusus94/Pronunciament | ---
license: openrail
---
|
choisy/dataset | ---
license: mit
---
|
Sidd2899/MyspeechASR | ---
pretty_name: LibriSpeech
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
paperswithcode_id: librispeech-1
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- automatic-speech-recognition
- audio-classification
task_ids:
- speaker-identification
---
# Dataset Card for librispeech_asr
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [LibriSpeech ASR corpus](http://www.openslr.org/12)
- **Repository:** [Needs More Information]
- **Paper:** [LibriSpeech: An ASR Corpus Based On Public Domain Audio Books](https://www.danielpovey.com/files/2015_icassp_librispeech.pdf)
- **Leaderboard:** [The 🤗 Speech Bench](https://huggingface.co/spaces/huggingface/hf-speech-bench)
- **Point of Contact:** [Daniel Povey](mailto:dpovey@gmail.com)
### Dataset Summary
LibriSpeech is a corpus of approximately 1000 hours of 16kHz read English speech, prepared by Vassil Panayotov with the assistance of Daniel Povey. The data is derived from read audiobooks from the LibriVox project, and has been carefully segmented and aligned.
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`, `speaker-identification`: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe it to written text. The most common evaluation metric is the word error rate (WER). The task has an active Hugging Face leaderboard at https://huggingface.co/spaces/huggingface/hf-speech-bench, which ranks models uploaded to the Hub by their WER. An external leaderboard at https://paperswithcode.com/sota/speech-recognition-on-librispeech-test-clean ranks the latest models from research and academia.
### Languages
The audio is in English. There are two configurations: `clean` and `other`.
The speakers in the corpus were ranked according to the WER of the transcripts of a model trained on
a different dataset, and were divided roughly in the middle,
with the lower-WER speakers designated as "clean" and the higher WER speakers designated as "other".
## Dataset Structure
### Data Instances
A typical data point comprises the path to the audio file, usually called `file` and its transcription, called `text`. Some additional information about the speaker and the passage which contains the transcription is provided.
```
{'chapter_id': 141231,
'file': '/home/siddhant/.cache/huggingface/datasets/downloads/extracted/b7ded9969e09942ab65313e691e6fc2e12066192ee8527e21d634aca128afbe2/dev_clean/1272/141231/1272-141231-0000.flac',
'audio': {'path': '/home/siddhant/.cache/huggingface/datasets/downloads/extracted/b7ded9969e09942ab65313e691e6fc2e12066192ee8527e21d634aca128afbe2/dev_clean/1272/141231/1272-141231-0000.flac',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346,
0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 16000},
'id': '1272-141231-0000',
'speaker_id': 1272,
'text': 'A MAN SAID TO THE UNIVERSE SIR I EXIST'}
```
### Data Fields
- file: A path to the downloaded audio file in .flac format.
- audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- text: the transcription of the audio file.
- id: unique id of the data sample.
- speaker_id: unique id of the speaker. The same speaker id can be found for multiple data samples.
- chapter_id: id of the audiobook chapter which includes the transcription.
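The access-order advice for the `audio` column can be illustrated with a toy stand-in for lazy decoding. This is not the actual `datasets` implementation, just a sketch of the behaviour: decoding happens on access, so the access order decides how many files get decoded.

```python
# Toy illustration of why `dataset[0]["audio"]` is preferred over
# `dataset["audio"][0]` (not the real `datasets` internals).

class ToyAudioDataset:
    def __init__(self, paths):
        self.paths = paths
        self.decode_count = 0

    def _decode(self, path):
        self.decode_count += 1  # stands in for expensive flac decoding
        return {"path": path, "array": [0.0], "sampling_rate": 16000}

    def __getitem__(self, key):
        if isinstance(key, int):   # row access: decode a single file
            return {"audio": self._decode(self.paths[key])}
        if key == "audio":         # column access: decode every file
            return [self._decode(p) for p in self.paths]

ds = ToyAudioDataset([f"{i}.flac" for i in range(1000)])
ds[0]["audio"]                 # decodes one file
assert ds.decode_count == 1
ds["audio"][0]                 # decodes all 1000 files before indexing
assert ds.decode_count == 1001
```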
### Data Splits
The size of the corpus makes it impractical, or at least inconvenient
for some users, to distribute it as a single large archive. Thus the
training portion of the corpus is split into three subsets, with approximate size 100, 360 and 500 hours respectively.
A simple automatic
procedure was used to select the audio in the first two sets to be, on
average, of higher recording quality and with accents closer to US
English. An acoustic model was trained on WSJ’s si-84 data subset
and was used to recognize the audio in the corpus, using a bigram
LM estimated on the text of the respective books. We computed the
Word Error Rate (WER) of this automatic transcript relative to our
reference transcripts obtained from the book texts.
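The WER used here is the standard word-level edit distance divided by the reference length. A minimal sketch of the metric (ours, not the procedure used to build the corpus) looks like this:

```python
def wer(reference, hypothesis):
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # prev[j] = edit distance between the processed prefix of ref and hyp[:j]
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(prev[j] + 1,               # deletion
                            curr[j - 1] + 1,           # insertion
                            prev[j - 1] + (r != h)))   # substitution
        prev = curr
    return prev[-1] / len(ref)

print(wer("a man said to the universe sir i exist",
          "a man said to the universe sir i exists"))  # 1 substitution / 9 words
```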
The speakers in the corpus were ranked according to the WER of
the WSJ model’s transcripts, and were divided roughly in the middle,
with the lower-WER speakers designated as "clean" and the higher-WER speakers designated as "other".
For "clean", the data is split into train, validation, and test set. The train set is further split into train.100 and train.360
respectively accounting for 100h and 360h of the training data.
For "other", the data is split into train, validation, and test set. The train set contains approximately 500h of recorded speech.
| | Train.500 | Train.360 | Train.100 | Valid | Test |
| ----- | ------ | ----- | ---- | ---- | ---- |
| clean | - | 104014 | 28539 | 2703 | 2620|
| other | 148688 | - | - | 2864 | 2939 |
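The clean/other partition described above (rank speakers by WER, split roughly at the median) can be sketched as follows; the per-speaker WERs here are invented for illustration:

```python
def split_speakers(speaker_wer):
    """Rank speakers by WER and split roughly in the middle:
    lower-WER half -> "clean", higher-WER half -> "other"."""
    ranked = sorted(speaker_wer, key=speaker_wer.get)
    mid = len(ranked) // 2
    return {"clean": ranked[:mid], "other": ranked[mid:]}

# Hypothetical per-speaker WERs from the WSJ-trained recognizer
wers = {"spk1": 0.04, "spk2": 0.22, "spk3": 0.07, "spk4": 0.31}
print(split_speakers(wers))
# {'clean': ['spk1', 'spk3'], 'other': ['spk2', 'spk4']}
```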
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
The dataset was initially created by Vassil Panayotov, Guoguo Chen, Daniel Povey, and Sanjeev Khudanpur.
### Licensing Information
[CC BY 4.0](https://creativecommons.org/licenses/by/4.0/)
### Citation Information
```
@inproceedings{panayotov2015librispeech,
  title={Librispeech: an ASR corpus based on public domain audio books},
author={Panayotov, Vassil and Chen, Guoguo and Povey, Daniel and Khudanpur, Sanjeev},
booktitle={Acoustics, Speech and Signal Processing (ICASSP), 2015 IEEE International Conference on},
pages={5206--5210},
year={2015},
organization={IEEE}
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset. |
chenghao/ledgar_qa | ---
license: mit
---
|