CyberHarem/onitsuka_tomari_lovelivesuperstar | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of onitsuka_tomari (Love Live! Superstar!!)
This is the dataset of onitsuka_tomari (Love Live! Superstar!!), containing 36 images and their tags.
The core tags of this character are `bangs, braid, twin_braids, twintails, green_hair, ribbon, long_hair, red_eyes, breasts, neck_ribbon, red_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([HuggingFace organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 36 | 65.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/onitsuka_tomari_lovelivesuperstar/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 36 | 30.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/onitsuka_tomari_lovelivesuperstar/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 96 | 70.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/onitsuka_tomari_lovelivesuperstar/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 36 | 54.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/onitsuka_tomari_lovelivesuperstar/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 96 | 113.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/onitsuka_tomari_lovelivesuperstar/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/onitsuka_tomari_lovelivesuperstar',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
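For the IMG+TXT packages, each image ships with a same-named `.txt` tag file. A minimal sketch for iterating over an extracted package, assuming that flat layout and comma-separated tags (the helper name and extension list are illustrative, not part of the dataset tooling):

```python
import os

def iter_tagged_images(dataset_dir):
    """Yield (image_path, tag_list) pairs from an extracted IMG+TXT package.

    Assumes each image file sits next to a .txt file with the same stem,
    containing comma-separated tags.
    """
    for fname in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(fname)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        tag_file = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(tag_file):
            continue
        with open(tag_file, 'r', encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        yield os.path.join(dataset_dir, fname), tags
```

This avoids the waifuc dependency entirely when you only need path/tag pairs for a training pipeline.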
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 36 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, long_sleeves, school_uniform, dress, white_shirt, blue_jacket, open_mouth, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | blush | long_sleeves | school_uniform | dress | white_shirt | blue_jacket | open_mouth | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:---------------|:-----------------|:--------|:--------------|:--------------|:-------------|:--------------------|
| 0 | 36 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
qazisaad/llama_2-product-titles-esci-test-sft-temp | ---
dataset_info:
  features:
  - name: index
    dtype: int64
  - name: query
    dtype: string
  - name: average_score
    dtype: float64
  - name: total_score
    dtype: float64
  - name: text
    dtype: string
  - name: label
    dtype: string
  - name: preds
    dtype: string
  - name: __index_level_0__
    dtype: int64
  splits:
  - name: train
    num_bytes: 5401147
    num_examples: 13996
  download_size: 1569052
  dataset_size: 5401147
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
# Dataset Card for "llama_2-product-titles-esci-test-sft-temp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sproos/scifact-de | ---
configs:
- config_name: default
  data_files:
  - split: queries
    path: data/queries-*
  - split: corpus
    path: data/corpus-*
dataset_info:
  features:
  - name: _id
    dtype: string
  - name: title
    dtype: string
  - name: text
    dtype: string
  splits:
  - name: queries
    num_bytes: 134678
    num_examples: 1109
  - name: corpus
    num_bytes: 9082081
    num_examples: 5183
  download_size: 78703
  dataset_size: 9216759
---
# Dataset Card for "scifact-de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
seungheondoh/LP-MusicCaps-MSD | ---
language:
- en
tags:
- art
- music
- text-to-music
- music-to-text
pretty_name: LP-MusicCaps-MSD
size_categories:
- 100K<n<1M
---
======================================
**!important**: Be careful when using `caption_attribute_prediction` (we do not recommend using it)!
======================================
# Dataset Card for LP-MusicCaps-MSD
## Dataset Description
- **Repository:** [LP-MusicCaps repository](https://github.com/seungheondoh/lp-music-caps)
- **Paper:** [ArXiv](https://arxiv.org/abs/2307.16372)
## Dataset Summary
**LP-MusicCaps** is a Large Language Model based Pseudo Music Caption dataset for `text-to-music` and `music-to-text` tasks. We construct the music-to-caption pairs with tag-to-caption generation (using three existing multi-label tag datasets and four task instructions). The data sources are MusicCaps, Magnatagtune, and Million Song Dataset ECALS subset.
- **LP-MusicCaps MSD (This Repo)**: 0.5M Audio with 2.2M Caption. We utilize 1054 unique tags in the [MSD-ECALS](https://github.com/SeungHeonDoh/msd-subsets) to perform tag-to-caption generation through LLM.
- [LP-MusicCaps MTT](https://huggingface.co/datasets/seungheondoh/LP-MusicCaps-MTT): 22k Audio with 88k Caption
- [LP-MusicCaps MC](https://huggingface.co/datasets/seungheondoh/LP-MusicCaps-MC): 6k Audio with 22k Caption.
## Data Instances
Each instance in LP-MusicCaps MSD (this repo) represents a track's meta-attributes and tags together with multiple pseudo captions:
```
{
  'track_id': 'TRIHXPZ128F1466744',
  'title': 'In The Sunshine',
  'artist_name': 'ARRESTED DEVELOPMENT',
  'release': 'Zingalamaduni',
  'year': 1994,
  'tag': ['laid back mellow',
          'hip hop',
          'rnb',
          'amiable good natured',
          'rap',
          'urban',
          'gentle',
          'political rap',
          'soul',
          'calm peaceful',
          'summery',
          'cheerful',
          'alternative rap'],
  'caption_writing': 'An amiable and laid back alternative rap tune, this summery and cheerful song blends elements of soul and R&B with a gentle, mellow rap flow to create a calm and peaceful urban vibe that is both hip hop and political in its message.',
  'caption_summary': 'This summery, alternative rap song is a mellow and gentle blend of hip hop, RnB, and political rap with a cheerful and amiable good natured vibe.',
  'caption_paraphrase': 'This laid back mellow rap song infuses soulful and urban elements while showcasing a gentle and amiable good natured vibe, perfect for a summery day. With hints of cheerful R&B and hip hop, the alternative political rap lyrics bring balance to this peaceful and calming tune.',
  'caption_attribute_prediction': 'This mellow, soulful tune is a perfect blend of rap and RnB, with a gentle beat and smooth flow that will transport you to the laid-back urban vibes of a sunny summertime day. The amiable good-natured lyrics touch on political themes, while the alternative rap style adds a cheerful, upbeat twist to the message. Overall, this is a hip-hop gem thats sure to put you in a peaceful, calm state of mind.',
  'path': '3/0/303545.clip.mp3'
}
```
## Pseudo Caption Example
Input tags:
*"video game theme, no singer, instrumental, analog sounding, small keyboard, beatboxing, playful, cheerful, groovy"*
Output pseudo caption:
*"instrumental track has a joyful and playful vibe, perfect for a video game theme. With no singer, the analog-sounding music features a small keyboard and beatboxing, creating a groovy and cheerful atmosphere"*
[More information on pseudo caption generation](https://github.com/seungheondoh/lp-music-caps/blob/main/lpmc/llm_captioning/generate.py)
## Data Fields
| Name | Type | Description |
|------------------------------|-----------------|----------------------------------------------------------------------|
| track_id | string | Unique identifier for the track |
| title | string | Title of the song |
| artist_name | string | Name of the artist performing the song |
| release | string | Release name or album name of the song |
| year | integer | Year of the song's release |
| tag | list of strings | List of tags associated with the song |
| caption_writing | string | Pseudo caption generated through a writing instruction |
| caption_summary | string | Pseudo caption generated through a summary instruction |
| caption_paraphrase | string | Pseudo caption generated through a paraphrase instruction |
| caption_attribute_prediction | string | Pseudo caption generated through an attribute_prediction instruction |
| path | string | File path or location of the audio clip |
## Data Splits
- train: 444865
- valid: 34481
- test: 34631
## Considerations for Using the Data
The LP-MusicCaps dataset is recommended for research purposes. Due to a mislabeling issue, we recommend not using `caption_attribute_prediction` and `pseudo_attribute` unless it is specifically for large-scale pretraining. Additionally, the field `is_crawled` indicates the samples used in the reference paper linked above.
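Following that recommendation, a minimal sketch for keeping only the recommended caption fields of a record (the record layout follows the Data Instances example above; `usable_captions` is a hypothetical helper, not part of the dataset tooling):

```python
# Caption fields the card advises against using for most purposes.
DISCOURAGED_FIELDS = ('caption_attribute_prediction',)

def usable_captions(record):
    """Return only the recommended pseudo-caption fields of a record."""
    return {
        key: value
        for key, value in record.items()
        if key.startswith('caption_') and key not in DISCOURAGED_FIELDS
    }
```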
## Discussion of Biases
It will be described in a paper to be released soon.
## Other Known Limitations
It will be described in a paper to be released soon. |
himanshusrivastava/indian_food_images | ---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
dataset_info:
  features:
  - name: image
    dtype: image
  - name: label
    dtype:
      class_label:
        names:
          '0': burger
          '1': butter_naan
          '2': chai
          '3': chapati
          '4': chole_bhature
          '5': dal_makhani
          '6': dhokla
          '7': fried_rice
          '8': idli
          '9': jalebi
          '10': kaathi_rolls
          '11': kadai_paneer
          '12': kulfi
          '13': masala_dosa
          '14': momos
          '15': paani_puri
          '16': pakode
          '17': pav_bhaji
          '18': pizza
          '19': samosa
  splits:
  - name: train
    num_bytes: 1221949355.0811334
    num_examples: 5015
  - name: test
    num_bytes: 332097930.20786667
    num_examples: 1254
  download_size: 1601728821
  dataset_size: 1554047285.289
---
# Dataset Card for "indian_food_images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
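The `label` feature is a `class_label` whose 20 names are listed in the YAML above; integer ids map to names by position. A small sketch of that mapping, usable without loading the dataset (the helper functions are illustrative):

```python
# Class names in id order, copied from the class_label block above.
FOOD_CLASSES = [
    'burger', 'butter_naan', 'chai', 'chapati', 'chole_bhature',
    'dal_makhani', 'dhokla', 'fried_rice', 'idli', 'jalebi',
    'kaathi_rolls', 'kadai_paneer', 'kulfi', 'masala_dosa', 'momos',
    'paani_puri', 'pakode', 'pav_bhaji', 'pizza', 'samosa',
]

def label_to_name(label_id):
    """Map an integer class id (as stored in the dataset) to its name."""
    return FOOD_CLASSES[label_id]

def name_to_label(name):
    """Map a class name back to its integer id."""
    return FOOD_CLASSES.index(name)
```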
unaidedelf87777/riddler | ---
dataset_info:
  features:
  - name: riddle
    dtype: string
  - name: answer
    dtype: string
  splits:
  - name: train
    num_bytes: 7268944
    num_examples: 4198
  download_size: 3995459
  dataset_size: 7268944
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
license: apache-2.0
---
# Dataset Card for "riddler"
4.1 thousand high-quality, hand-vetted riddles, augmented with gpt-4-turbo (before they made it suck). |
CyberHarem/yagami_makino_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yagami_makino/八神マキノ/야가미 마키노 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yagami_makino/八神マキノ/야가미 마키노 (THE iDOLM@STER: Cinderella Girls), containing 216 images and their tags.
The core tags of this character are `glasses, long_hair, breasts, brown_hair, semi-rimless_eyewear, purple_eyes, purple_hair, bangs, large_breasts, blue_eyes, under-rim_eyewear, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([HuggingFace organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 216 | 265.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yagami_makino_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 216 | 161.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yagami_makino_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 506 | 333.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yagami_makino_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 216 | 240.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yagami_makino_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 506 | 462.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yagami_makino_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/yagami_makino_idolmastercinderellagirls',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, open_shirt, solo, navel, blush, undressing, blue_bra, blue_panties, collarbone, lace-trimmed_bra, parted_lips, purple_bra, purple_panties, simple_background, white_shirt |
| 1 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, collared_shirt, school_uniform, upper_body, vest, white_shirt, simple_background, smile, white_background, black_necktie, blush, closed_mouth, grey-framed_eyewear, short_sleeves |
| 2 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, plaid_skirt, pleated_skirt, short_sleeves, solo, collared_shirt, school_uniform, simple_background, white_shirt, closed_mouth, grey_eyes, vest, white_background, blue_necktie, open_clothes, smile |
| 3 | 12 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, smile, solo, necklace, blush, collarbone, off-shoulder_shirt, upper_body, closed_mouth, bracelet, white_shirt, simple_background |
| 4 | 11 |  |  |  |  |  | cleavage, day, outdoors, 1girl, blue_sky, cloud, looking_at_viewer, red_bikini, collarbone, floral_print, smile, solo, beach, navel, blush, grey-framed_eyewear, ocean, bare_shoulders, frilled_bikini, earrings, hair_between_eyes, off_shoulder, open_clothes, open_mouth, upper_body |
| 5 | 11 |  |  |  |  |  | 1girl, solo, elbow_gloves, fingerless_gloves, hair_ornament, smile, black_gloves, cleavage_cutout, garter_straps, looking_at_viewer, thighhighs, white_background |
| 6 | 10 |  |  |  |  |  | rabbit_ears, 1girl, detached_collar, fake_animal_ears, looking_at_viewer, playboy_bunny, solo, wrist_cuffs, cleavage, black_leotard, black_bowtie, strapless_leotard, rabbit_tail, simple_background, white_background, bare_shoulders, brown_pantyhose, hairband, sweatdrop |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | open_shirt | solo | navel | blush | undressing | blue_bra | blue_panties | collarbone | lace-trimmed_bra | parted_lips | purple_bra | purple_panties | simple_background | white_shirt | collared_shirt | school_uniform | upper_body | vest | smile | white_background | black_necktie | closed_mouth | grey-framed_eyewear | short_sleeves | plaid_skirt | pleated_skirt | grey_eyes | blue_necktie | open_clothes | bare_shoulders | necklace | off-shoulder_shirt | bracelet | day | outdoors | blue_sky | cloud | red_bikini | floral_print | beach | ocean | frilled_bikini | earrings | hair_between_eyes | off_shoulder | open_mouth | elbow_gloves | fingerless_gloves | hair_ornament | black_gloves | cleavage_cutout | garter_straps | thighhighs | rabbit_ears | detached_collar | fake_animal_ears | playboy_bunny | wrist_cuffs | black_leotard | black_bowtie | strapless_leotard | rabbit_tail | brown_pantyhose | hairband | sweatdrop |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:-------------|:-------|:--------|:--------|:-------------|:-----------|:---------------|:-------------|:-------------------|:--------------|:-------------|:-----------------|:--------------------|:--------------|:-----------------|:-----------------|:-------------|:-------|:--------|:-------------------|:----------------|:---------------|:----------------------|:----------------|:--------------|:----------------|:------------|:---------------|:---------------|:-----------------|:-----------|:---------------------|:-----------|:------|:-----------|:-----------|:--------|:-------------|:---------------|:--------|:--------|:-----------------|:-----------|:--------------------|:---------------|:-------------|:---------------|:--------------------|:----------------|:---------------|:------------------|:----------------|:-------------|:--------------|:------------------|:-------------------|:----------------|:--------------|:----------------|:---------------|:--------------------|:--------------|:------------------|:-----------|:------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | X | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | | X | | | | | | | | | | | X | X | X | X | | X | X | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | | X | | X | | X | | | | X | | | | | X | X | | | X | | X | | | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | X | X | | X | X | X | | | | X | | | | | | | | | X | | X | | | | X | | | | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | | X | | X | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | X | X | | X | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/jintsu_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jintsu/神通/神通 (Azur Lane)
This is the dataset of jintsu/神通/神通 (Azur Lane), containing 77 images and their tags.
The core tags of this character are `animal_ears, long_hair, blue_eyes, breasts, blue_hair, fox_ears, hair_ornament, large_breasts, ponytail, animal_ear_fluff, tail, hair_flower, fox_tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([HuggingFace organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 77 | 128.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jintsu_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 77 | 67.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jintsu_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 190 | 144.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jintsu_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 77 | 110.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jintsu_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 190 | 217.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jintsu_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/jintsu_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 64 |  |  |  |  |  | 1girl, solo, looking_at_viewer, flower, wide_sleeves, folding_fan, holding_fan, bare_shoulders, cleavage, smile, skirt, kimono, detached_sleeves, obi, simple_background, fur_scarf |
| 1 | 10 |  |  |  |  |  | 1girl, flower, looking_at_viewer, solo, ass, blue_bikini, blush, smile, white_bikini, looking_back, bangs, outdoors |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | flower | wide_sleeves | folding_fan | holding_fan | bare_shoulders | cleavage | smile | skirt | kimono | detached_sleeves | obi | simple_background | fur_scarf | ass | blue_bikini | blush | white_bikini | looking_back | bangs | outdoors |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------|:---------------|:--------------|:--------------|:-----------------|:-----------|:--------|:--------|:---------|:-------------------|:------|:--------------------|:------------|:------|:--------------|:--------|:---------------|:---------------|:--------|:-----------|
| 0 | 64 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | | | | | | X | | | | | | | X | X | X | X | X | X | X |
|
Jing24/generate_sub_7 | ---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: title
    dtype: string
  - name: context
    dtype: string
  - name: question
    dtype: string
  - name: answers
    struct:
    - name: answer_start
      sequence: int64
    - name: text
      sequence: string
  splits:
  - name: train
    num_bytes: 21022386
    num_examples: 23401
  download_size: 3811300
  dataset_size: 21022386
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
# Dataset Card for "generate_sub_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
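The `answers` struct mirrors the SQuAD convention: parallel `answer_start` and `text` sequences that index into `context`. A sketch that validates this alignment on a hypothetical record (the record contents are illustrative, not taken from the dataset):

```python
def answer_spans(record):
    """Yield (start, end, text) answer spans, checking each against the context."""
    context = record['context']
    answers = record['answers']
    for start, text in zip(answers['answer_start'], answers['text']):
        end = start + len(text)
        if context[start:end] != text:
            raise ValueError(f'answer {text!r} does not match context at {start}')
        yield start, end, text
```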
Jacque008/categoriy_240102 | ---
dataset_info:
  features:
  - name: emailId
    dtype: int64
  - name: email
    dtype: string
  splits:
  - name: train
    num_bytes: 73867
    num_examples: 353
  download_size: 13279
  dataset_size: 73867
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
|
biwi_kinect_head_pose | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- other
multilinguality:
- monolingual
pretty_name: Biwi Kinect Head Pose Database
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- other
task_ids: []
paperswithcode_id: biwi
tags:
- head-pose-estimation
dataset_info:
  features:
  - name: sequence_number
    dtype: string
  - name: subject_id
    dtype: string
  - name: rgb
    sequence: image
  - name: rgb_cal
    struct:
    - name: intrisic_mat
      dtype:
        array2_d:
          shape:
          - 3
          - 3
          dtype: float64
    - name: extrinsic_mat
      struct:
      - name: rotation
        dtype:
          array2_d:
            shape:
            - 3
            - 3
            dtype: float64
      - name: translation
        sequence: float64
        length: 3
  - name: depth
    sequence: string
  - name: depth_cal
    struct:
    - name: intrisic_mat
      dtype:
        array2_d:
          shape:
          - 3
          - 3
          dtype: float64
    - name: extrinsic_mat
      struct:
      - name: rotation
        dtype:
          array2_d:
            shape:
            - 3
            - 3
            dtype: float64
      - name: translation
        sequence: float64
        length: 3
  - name: head_pose_gt
    sequence:
    - name: center
      sequence: float64
      length: 3
    - name: rotation
      dtype:
        array2_d:
          shape:
          - 3
          - 3
          dtype: float64
  - name: head_template
    dtype: string
  splits:
  - name: train
    num_bytes: 6914063
    num_examples: 24
  download_size: 6014398431
  dataset_size: 6914063
---
# Dataset Card for Biwi Kinect Head Pose Database
## Table of Contents
- [Dataset Description](#dataset-description)
  - [Dataset Summary](#dataset-summary)
  - [Supported Tasks](#supported-tasks-and-leaderboards)
  - [Languages](#languages)
- [Dataset Structure](#dataset-structure)
  - [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
  - [Curation Rationale](#curation-rationale)
  - [Source Data](#source-data)
  - [Annotations](#annotations)
  - [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
  - [Social Impact of Dataset](#social-impact-of-dataset)
  - [Discussion of Biases](#discussion-of-biases)
  - [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
  - [Dataset Curators](#dataset-curators)
  - [Licensing Information](#licensing-information)
  - [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Biwi Kinect Head Pose homepage](https://icu.ee.ethz.ch/research/datsets.html)
- **Repository:** [Needs More Information]
- **Paper:** [Biwi Kinect Head Pose paper](https://link.springer.com/article/10.1007/s11263-012-0549-0)
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Gabriele Fanelli](mailto:gabriele.fanelli@gmail.com)
### Dataset Summary
The Biwi Kinect Head Pose Database was acquired with the Microsoft Kinect sensor, a structured IR light device. It contains 15K images of 20 people (6 females and 14 males), 4 of whom were recorded twice.
For each frame, there are:
- a depth image,
- a corresponding RGB image (both 640x480 pixels),
- an annotation.
The head pose range covers about ±75 degrees of yaw and ±60 degrees of pitch. The ground truth is the 3D location of the head and its rotation.
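Since the rotation ground truth is a 3x3 matrix, yaw/pitch/roll angles have to be extracted from it. A sketch using one common Z-Y-X (yaw-pitch-roll) convention; the database's own angle convention may differ, so treat this purely as an illustration:

```python
import math

def yaw_pitch_roll(R):
    """Extract (yaw, pitch, roll) in degrees from a 3x3 rotation matrix,
    assuming a Z-Y-X (yaw * pitch * roll) composition order.

    Note: this convention is an assumption, not taken from the database docs.
    """
    yaw = math.atan2(R[1][0], R[0][0])
    pitch = math.atan2(-R[2][0], math.hypot(R[2][1], R[2][2]))
    roll = math.atan2(R[2][1], R[2][2])
    return tuple(math.degrees(a) for a in (yaw, pitch, roll))
```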
### Data Processing
Example code for reading a compressed binary depth image file provided by the authors.
<details>
<summary> View C++ Code </summary>
```cpp
/*
* Gabriele Fanelli
*
* fanelli@vision.ee.ethz.ch
*
* BIWI, ETHZ, 2011
*
* Part of the Biwi Kinect Head Pose Database
*
* Example code for reading a compressed binary depth image file.
*
* THE SOFTWARE IS PROVIDED โAS ISโ AND THE PROVIDER GIVES NO EXPRESS OR IMPLIED WARRANTIES OF ANY KIND,
* INCLUDING WITHOUT LIMITATION THE WARRANTIES OF FITNESS FOR ANY PARTICULAR PURPOSE AND NON-INFRINGEMENT.
* IN NO EVENT SHALL THE PROVIDER BE HELD RESPONSIBLE FOR LOSS OR DAMAGE CAUSED BY THE USE OF THE SOFTWARE.
*
*
*/
#include <iostream>
#include <fstream>
#include <cstdlib>
int16_t* loadDepthImageCompressed( const char* fname ){
    //now read the depth image
    FILE* pFile = fopen(fname, "rb");
    if(!pFile){
        std::cerr << "could not open file " << fname << std::endl;
        return NULL;
    }

    int im_width = 0;
    int im_height = 0;
    bool success = true;

    success &= ( fread(&im_width,sizeof(int),1,pFile) == 1 );  // read width of depthmap
    success &= ( fread(&im_height,sizeof(int),1,pFile) == 1 ); // read height of depthmap

    int16_t* depth_img = new int16_t[im_width*im_height];

    int numempty;
    int numfull;
    int p = 0;
    while(p < im_width*im_height ){
        success &= ( fread( &numempty,sizeof(int),1,pFile) == 1 );
        for(int i = 0; i < numempty; i++)
            depth_img[ p + i ] = 0;
        success &= ( fread( &numfull,sizeof(int), 1, pFile) == 1 );
        success &= ( fread( &depth_img[ p + numempty ], sizeof(int16_t), numfull, pFile) == (unsigned int) numfull );
        p += numempty+numfull;
    }
    fclose(pFile);

    if(success)
        return depth_img;
    else{
        delete [] depth_img;
        return NULL;
    }
}
float* read_gt(const char* fname){
    //try to read in the ground truth from a binary file
    FILE* pFile = fopen(fname, "rb");
    if(!pFile){
        std::cerr << "could not open file " << fname << std::endl;
        return NULL;
    }

    float* data = new float[6];
    bool success = true;
    success &= ( fread( &data[0], sizeof(float), 6, pFile) == 6 );
    fclose(pFile);

    if(success)
        return data;
    else{
        delete [] data;
        return NULL;
    }
}
```
</details>
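For Python users, the same run-length layout can be decoded without the C++ tool. Below is a minimal sketch (the function name `load_depth_image_compressed` is our own); it mirrors the format read by the C++ code above: two little-endian int32 header fields for width and height, then alternating runs of `[numempty:int32][numfull:int32][numfull x int16]`:

```python
import struct

def load_depth_image_compressed(fname):
    """Decode a Biwi compressed depth file.

    Returns (width, height, flat list of int16 depth values).
    Pixels covered by an "empty" run are left as 0, matching the C++ reader.
    """
    with open(fname, "rb") as f:
        width, height = struct.unpack("<ii", f.read(8))
        depth = [0] * (width * height)
        p = 0
        while p < width * height:
            # each run: count of empty pixels, then count of valid pixels
            numempty, numfull = struct.unpack("<ii", f.read(8))
            vals = struct.unpack("<%dh" % numfull, f.read(2 * numfull))
            depth[p + numempty : p + numempty + numfull] = vals
            p += numempty + numfull
        return width, height, depth
```

The returned flat list can be reshaped to `height x width` with NumPy if desired; a truncated file will raise `struct.error`.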
### Supported Tasks and Leaderboards
The Biwi Kinect Head Pose Database supports the following tasks:
- Head pose estimation
- Pose estimation
- Face verification
### Languages
[Needs More Information]
## Dataset Structure
### Data Instances
A sample from the Biwi Kinect Head Pose dataset is provided below:
```
{
'sequence_number': '12',
'subject_id': 'M06',
'rgb': [<PIL.PngImagePlugin.PngImageFile image mode=RGB size=640x480 at 0x7F53A6446C10>,.....],
'rgb_cal':
{
'intrisic_mat': [[517.679, 0.0, 320.0], [0.0, 517.679, 240.5], [0.0, 0.0, 1.0]],
'extrinsic_mat':
{
'rotation': [[0.999947, 0.00432361, 0.00929419], [-0.00446314, 0.999877, 0.0150443], [-0.009228, -0.015085, 0.999844]],
'translation': [-24.0198, 5.8896, -13.2308]
}
},
'depth': ['../hpdb/12/frame_00003_depth.bin', .....],
'depth_cal':
{
'intrisic_mat': [[575.816, 0.0, 320.0], [0.0, 575.816, 240.0], [0.0, 0.0, 1.0]],
'extrinsic_mat':
{
'rotation': [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]],
'translation': [0.0, 0.0, 0.0]
}
},
'head_pose_gt':
{
'center': [[43.4019, -30.7038, 906.864], [43.0202, -30.8683, 906.94], [43.0255, -30.5611, 906.659], .....],
'rotation': [[[0.980639, 0.109899, 0.162077], [-0.11023, 0.993882, -0.00697376], [-0.161851, -0.011027, 0.986754]], ......]
}
}
```
### Data Fields
- `sequence_number` : This refers to the sequence number in the dataset. There are 24 sequences in total.
- `subject_id` : This refers to the subject in the dataset. There are 20 people in total (6 females and 14 males); 4 of them were recorded twice.
- `rgb` : List of PNG frames containing the poses.
- `rgb_cal`: Contains calibration information for the color camera which includes intrinsic matrix,
global rotation and translation.
- `depth` : List of depth frames for the poses.
- `depth_cal`: Contains calibration information for the depth camera which includes intrinsic matrix, global rotation and translation.
- `head_pose_gt` : Contains ground truth information, i.e., the location of the center of the head in 3D and the head rotation, encoded as a 3x3 rotation matrix.
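Since `head_pose_gt` stores the rotation as a 3x3 matrix, you may want Euler angles to compare against the ±75 degree yaw and ±60 degree pitch ranges quoted above. A minimal sketch, assuming a ZYX (yaw-pitch-roll) convention; the dataset's own tools may use a different axis ordering, so verify against the reference code before relying on it:

```python
import math

def rotation_to_euler(R):
    """Convert a 3x3 rotation matrix (list of rows) to (yaw, pitch, roll) in degrees.

    Assumes the ZYX intrinsic convention, i.e. R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    """
    pitch = math.degrees(math.asin(-R[2][0]))
    yaw = math.degrees(math.atan2(R[1][0], R[0][0]))
    roll = math.degrees(math.atan2(R[2][1], R[2][2]))
    return yaw, pitch, roll
```

For example, an identity matrix yields all-zero angles, and a pure rotation about the vertical axis shows up entirely in the yaw component.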
### Data Splits
All the data is contained in the training set.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
The Biwi Kinect Head Pose Database is acquired with the Microsoft Kinect sensor, a structured IR light device.
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
From the dataset's README:
> The database contains 24 sequences acquired with a Kinect sensor. 20 people (some were recorded twice - 6 women and 14 men) were recorded while turning their heads, sitting in front of the sensor, at roughly one meter of distance.
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
From the dataset's README:
> This database is made available for non-commercial use such as university research and education.
### Citation Information
```bibtex
@article{fanelli_IJCV,
author = {Fanelli, Gabriele and Dantone, Matthias and Gall, Juergen and Fossati, Andrea and Van Gool, Luc},
title = {Random Forests for Real Time 3D Face Analysis},
journal = {Int. J. Comput. Vision},
year = {2013},
month = {February},
volume = {101},
number = {3},
pages = {437--458}
}
```
### Contributions
Thanks to [@dnaveenr](https://github.com/dnaveenr) for adding this dataset. |
sheepy928/rt_merged | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 25082040.23509904
num_examples: 170188
- name: test
num_bytes: 4426363.76490096
num_examples: 30034
download_size: 18535178
dataset_size: 29508404.0
---
# Dataset Card for "cs490_reddit_twitter_merged"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Chukana/ChukanaPhoto | ---
license: apache-2.0
language:
- en
--- |
alisson40889/cidd | ---
license: openrail
---
|
hammer888/interior_style_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1345413786.311
num_examples: 7233
download_size: 0
dataset_size: 1345413786.311
---
# Dataset Card for "interior_style_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dariadaria/threads_reviews | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: identifier
dtype: int64
- name: review_date
dtype: string
- name: review_description
dtype: string
- name: topic
dtype: string
- name: sentiment
dtype: int64
splits:
- name: train
num_bytes: 11532282
num_examples: 54050
- name: test
num_bytes: 3902570
num_examples: 18033
download_size: 978796
dataset_size: 15434852
---
# Dataset Card for "threads_reviews"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
showchen/zero-Amiya | ---
license: apache-2.0
---
|
ml4pubmed/pubmed-classification-20k | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
tags:
- pubmed
size_categories:
- 10K<n<100K
---
# ml4pubmed/pubmed-classification-20k
- A 20k-example subset of PubMed text classification data, prepared for a course. |
zhengr/HuangdiNeijing | ---
license: mit
---
|
CyberHarem/shigure_kira_honkai3 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shigure_kira (Houkai 3rd)
This is the dataset of shigure_kira (Houkai 3rd), containing 85 images and their tags.
The core tags of this character are `long_hair, blue_eyes, bangs, blue_hair, breasts, hair_ornament, ponytail, ahoge, hair_between_eyes, very_long_hair, braid, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 85 | 145.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kira_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 85 | 71.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kira_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 199 | 153.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kira_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 85 | 124.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kira_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 199 | 231.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kira_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shigure_kira_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, blue_nails, looking_at_viewer, solo, smile, hairclip, nail_polish, one_eye_closed, detached_sleeves, gloves, open_mouth, blue_dress, cleavage, holding_gun, bare_shoulders, single_thighhigh |
| 1 | 5 |  |  |  |  |  | 1girl, full_body, solo, white_dress, looking_at_viewer, simple_background, white_background, bare_shoulders, black_thighhighs, bow, cleavage, detached_sleeves, hair_flower, holding_sword, white_hair, high_heels, single_glove, white_thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_nails | looking_at_viewer | solo | smile | hairclip | nail_polish | one_eye_closed | detached_sleeves | gloves | open_mouth | blue_dress | cleavage | holding_gun | bare_shoulders | single_thighhigh | full_body | white_dress | simple_background | white_background | black_thighhighs | bow | hair_flower | holding_sword | white_hair | high_heels | single_glove | white_thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------------|:-------|:--------|:-----------|:--------------|:-----------------|:-------------------|:---------|:-------------|:-------------|:-----------|:--------------|:-----------------|:-------------------|:------------|:--------------|:--------------------|:-------------------|:-------------------|:------|:--------------|:----------------|:-------------|:-------------|:---------------|:-------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | | | | | X | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X |
|
icaro23/jeanv2 | ---
license: apache-2.0
---
|
benayas/atis_chatgpt_20pct_v0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 432823
num_examples: 4455
download_size: 148776
dataset_size: 432823
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
davidgaofc/PriMa5_inout_clean | ---
dataset_info:
features:
- name: Text
dtype: string
- name: Label
dtype: int64
splits:
- name: train
num_bytes: 809188
num_examples: 910
download_size: 313284
dataset_size: 809188
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nerfgun3/pastel_style | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/pastel_style/resolve/main/pastel_style.jpg"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Pastel Style Embedding / Textual Inversion
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/pastel_style/resolve/main/pastel_style.jpg"/>
## Usage
To use this embedding, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder.
To use it in a prompt: ```"pastel_style"```
Personally, I recommend using my embeddings at a strength of 0.8, like ```"(pastel_style:0.8)"```
I trained the embedding for two epochs, up to 6000 steps.
I hope you enjoy the embedding. If you have any questions, you can ask me anything via Discord: "Nerfgun3#7508"
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
mtkinit/another_short_slovak_dataset | ---
pretty_name: another-short-slovak-dataset
---
# another-short-slovak-dataset
Created from the AIOD platform |
offbeatPickle/medical | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
license:
- cc0-1.0
multilinguality:
- monolingual
source_datasets:
- extended|common_voice
task_categories:
- automatic-speech-recognition
task_ids: []
paperswithcode_id: common-voice
pretty_name: Common Voice Corpus 11.0
extra_gated_prompt: >-
By clicking on "Access repository" below, you also agree to not attempt to
determine the identity of speakers in the Common Voice dataset.
language:
- en
--- |
AdiOO7/Bank_Complaints | ---
license: apache-2.0
task_categories:
- table-question-answering
language:
- en
tags:
- finance
size_categories:
- 1K<n<10K
--- |
open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20 | ---
pretty_name: Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-20
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wang7776/Mistral-7B-Instruct-v0.2-sparsity-20](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-20)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-15T10:42:05.147679](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20/blob/main/results_2024-01-15T10-42-05.147679.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47207806761545057,\n\
\ \"acc_stderr\": 0.0343560515515407,\n \"acc_norm\": 0.4785715950847517,\n\
\ \"acc_norm_stderr\": 0.03514049991292305,\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144905,\n \"mc2\": 0.47217481958020674,\n\
\ \"mc2_stderr\": 0.01506601596455064\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4761092150170648,\n \"acc_stderr\": 0.014594701798071654,\n\
\ \"acc_norm\": 0.5264505119453925,\n \"acc_norm_stderr\": 0.014590931358120174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5748854809798845,\n\
\ \"acc_stderr\": 0.004933500261683596,\n \"acc_norm\": 0.767078271260705,\n\
\ \"acc_norm_stderr\": 0.004218289279767987\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731837,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731837\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666666,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.037657466938651504,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.037657466938651504\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714506,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714506\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.031489558297455304,\n\
\ \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.031489558297455304\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.19298245614035087,\n\
\ \"acc_stderr\": 0.037124548537213684,\n \"acc_norm\": 0.19298245614035087,\n\
\ \"acc_norm_stderr\": 0.037124548537213684\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.023865206836972602,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.023865206836972602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5193548387096775,\n\
\ \"acc_stderr\": 0.028422687404312107,\n \"acc_norm\": 0.5193548387096775,\n\
\ \"acc_norm_stderr\": 0.028422687404312107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998574,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n\
\ \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.025294608023986472,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.025294608023986472\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267638,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267638\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6256880733944954,\n \"acc_stderr\": 0.020748959408988313,\n \"\
acc_norm\": 0.6256880733944954,\n \"acc_norm_stderr\": 0.020748959408988313\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5980392156862745,\n \"acc_stderr\": 0.03441190023482465,\n \"\
acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.03441190023482465\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n\
\ \"acc_stderr\": 0.03314190222110657,\n \"acc_norm\": 0.57847533632287,\n\
\ \"acc_norm_stderr\": 0.03314190222110657\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n\
\ \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6837606837606838,\n\
\ \"acc_stderr\": 0.03046365674734025,\n \"acc_norm\": 0.6837606837606838,\n\
\ \"acc_norm_stderr\": 0.03046365674734025\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.644955300127714,\n\
\ \"acc_stderr\": 0.017112085772772994,\n \"acc_norm\": 0.644955300127714,\n\
\ \"acc_norm_stderr\": 0.017112085772772994\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n \
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.02849199358617156,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.02849199358617156\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5176848874598071,\n\
\ \"acc_stderr\": 0.02838032284907713,\n \"acc_norm\": 0.5176848874598071,\n\
\ \"acc_norm_stderr\": 0.02838032284907713\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.0275860062216077,\n\
\ \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.0275860062216077\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169924,\n \
\ \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169924\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3494132985658409,\n\
\ \"acc_stderr\": 0.012177306252786691,\n \"acc_norm\": 0.3494132985658409,\n\
\ \"acc_norm_stderr\": 0.012177306252786691\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.0302114796091216,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.0302114796091216\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4297385620915033,\n \"acc_stderr\": 0.020027122784928547,\n \
\ \"acc_norm\": 0.4297385620915033,\n \"acc_norm_stderr\": 0.020027122784928547\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n\
\ \"acc_stderr\": 0.04709306978661897,\n \"acc_norm\": 0.4090909090909091,\n\
\ \"acc_norm_stderr\": 0.04709306978661897\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n\
\ \"acc_stderr\": 0.03390393042268814,\n \"acc_norm\": 0.6417910447761194,\n\
\ \"acc_norm_stderr\": 0.03390393042268814\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n\
\ \"acc_stderr\": 0.03726214354322416,\n \"acc_norm\": 0.35542168674698793,\n\
\ \"acc_norm_stderr\": 0.03726214354322416\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.03733756969066164,\n\
\ \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.03733756969066164\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144905,\n \"mc2\": 0.47217481958020674,\n\
\ \"mc2_stderr\": 0.01506601596455064\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6906077348066298,\n \"acc_stderr\": 0.012991329330822993\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \
\ \"acc_stderr\": 0.008719339028833054\n }\n}\n```"
repo_url: https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-20
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|arc:challenge|25_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|gsm8k|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hellaswag|10_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T10-42-05.147679.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T10-42-05.147679.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- '**/details_harness|winogrande|5_2024-01-15T10-42-05.147679.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-15T10-42-05.147679.parquet'
- config_name: results
data_files:
- split: 2024_01_15T10_42_05.147679
path:
- results_2024-01-15T10-42-05.147679.parquet
- split: latest
path:
- results_2024-01-15T10-42-05.147679.parquet
---
# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-20
Dataset automatically created during the evaluation run of model [wang7776/Mistral-7B-Instruct-v0.2-sparsity-20](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-20) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20",
	"harness_winogrande_5",
	split="latest")
```
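The per-task detail configurations listed in the YAML header all follow the same naming pattern (`harness_<task>_<n_shot>`), so a config name can be built programmatically. The sketch below uses a hypothetical helper, `details_config`, which is not part of the `datasets` API; the actual download call is shown commented out since it requires network access:

```python
# Hypothetical helper (not part of the `datasets` library): builds the
# config name for a detail config, following the pattern visible in the
# YAML header above, e.g. "harness_hendrycksTest_anatomy_5".
def details_config(task: str, n_shot: int) -> str:
    return f"harness_{task}_{n_shot}"

print(details_config("hendrycksTest_anatomy", 5))
# prints "harness_hendrycksTest_anatomy_5"

# To actually fetch the data (requires network access):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20",
#     details_config("hendrycksTest_anatomy", 5),
#     split="latest",
# )
```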
## Latest results
These are the [latest results from run 2024-01-15T10:42:05.147679](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20/blob/main/results_2024-01-15T10-42-05.147679.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of each eval):
```json
{
"all": {
"acc": 0.47207806761545057,
"acc_stderr": 0.0343560515515407,
"acc_norm": 0.4785715950847517,
"acc_norm_stderr": 0.03514049991292305,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144905,
"mc2": 0.47217481958020674,
"mc2_stderr": 0.01506601596455064
},
"harness|arc:challenge|25": {
"acc": 0.4761092150170648,
"acc_stderr": 0.014594701798071654,
"acc_norm": 0.5264505119453925,
"acc_norm_stderr": 0.014590931358120174
},
"harness|hellaswag|10": {
"acc": 0.5748854809798845,
"acc_stderr": 0.004933500261683596,
"acc_norm": 0.767078271260705,
"acc_norm_stderr": 0.004218289279767987
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666666,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.037657466938651504,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.037657466938651504
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714506,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714506
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3659574468085106,
"acc_stderr": 0.031489558297455304,
"acc_norm": 0.3659574468085106,
"acc_norm_stderr": 0.031489558297455304
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.19298245614035087,
"acc_stderr": 0.037124548537213684,
"acc_norm": 0.19298245614035087,
"acc_norm_stderr": 0.037124548537213684
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.023865206836972602,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.023865206836972602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5193548387096775,
"acc_stderr": 0.028422687404312107,
"acc_norm": 0.5193548387096775,
"acc_norm_stderr": 0.028422687404312107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998574,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.025294608023986472,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.025294608023986472
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.025644108639267638,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.025644108639267638
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6256880733944954,
"acc_stderr": 0.020748959408988313,
"acc_norm": 0.6256880733944954,
"acc_norm_stderr": 0.020748959408988313
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.03441190023482465,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.03441190023482465
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.03314190222110657,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.03314190222110657
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5398773006134969,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.5398773006134969,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6837606837606838,
"acc_stderr": 0.03046365674734025,
"acc_norm": 0.6837606837606838,
"acc_norm_stderr": 0.03046365674734025
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.644955300127714,
"acc_stderr": 0.017112085772772994,
"acc_norm": 0.644955300127714,
"acc_norm_stderr": 0.017112085772772994
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.02849199358617156,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.02849199358617156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5176848874598071,
"acc_stderr": 0.02838032284907713,
"acc_norm": 0.5176848874598071,
"acc_norm_stderr": 0.02838032284907713
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.0275860062216077,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.0275860062216077
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169924,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169924
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3494132985658409,
"acc_stderr": 0.012177306252786691,
"acc_norm": 0.3494132985658409,
"acc_norm_stderr": 0.012177306252786691
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4297385620915033,
"acc_stderr": 0.020027122784928547,
"acc_norm": 0.4297385620915033,
"acc_norm_stderr": 0.020027122784928547
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.04709306978661897,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.04709306978661897
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268814,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268814
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322416,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322416
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.03733756969066164,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.03733756969066164
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144905,
"mc2": 0.47217481958020674,
"mc2_stderr": 0.01506601596455064
},
"harness|winogrande|5": {
"acc": 0.6906077348066298,
"acc_stderr": 0.012991329330822993
},
"harness|gsm8k|5": {
"acc": 0.11296436694465505,
"acc_stderr": 0.008719339028833054
}
}
```
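Because the per-task breakdown above is plain JSON, summary statistics can be recomputed from it directly. A minimal sketch (the inline sample mirrors the structure above; in practice you would `json.load` the results file from this repo instead):

```python
import json

# Inline sample mirroring the structure of the results above
# (in practice, load the full results file from this repo instead).
sample = """
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29, "acc_norm": 0.29},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.4740740740740741, "acc_norm": 0.4740740740740741},
  "harness|gsm8k|5": {"acc": 0.11296436694465505}
}
"""
results = json.loads(sample)

# Macro-average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(round(mmlu_avg, 4))  # 0.382 for this two-task sample
```

The same filtering works for any benchmark family, since each key is prefixed with its harness task name.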
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
apoorvasrinivasan/consort-tm | ---
license: mit
---
### Paper
[Toward assessing clinical trial publications for reporting transparency](https://pubmed.ncbi.nlm.nih.gov/33647518/) |
hle2000/Mintaka_Graph_Features_T5-xl-ssm | ---
dataset_info:
features:
- name: question
dtype: string
- name: question_answer
dtype: string
- name: num_nodes
dtype: int64
- name: num_edges
dtype: int64
- name: density
dtype: float64
- name: cycle
dtype: int64
- name: bridge
dtype: int64
- name: katz_centrality
dtype: float64
- name: page_rank
dtype: float64
- name: avg_ssp_length
dtype: float64
- name: graph_sequence
dtype: string
- name: updated_graph_sequence
dtype: string
- name: graph_sequence_embedding
dtype: string
- name: updated_graph_sequence_embedding
dtype: string
- name: question_answer_embedding
dtype: string
- name: tfidf_vector
dtype: string
- name: correct
dtype: float64
splits:
- name: train
num_bytes: 8930682861
num_examples: 86381
- name: test
num_bytes: 2234104926
num_examples: 21574
download_size: 2056059858
dataset_size: 11164787787
---
# Dataset Card for "Mintaka_Graph_Features_T5-xl-ssm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlp-vtcc/belle_vi_20k | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 13745755
num_examples: 20000
download_size: 5694988
dataset_size: 13745755
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Reyansh4/Fake-News-Classification | ---
license: cc-by-nd-4.0
task_categories:
- text-classification
language:
- en
tags:
- code
size_categories:
- 10K<n<100K
---
Develop a machine learning program to identify when an article might be fake news. Run by the UTK Machine Learning Club.
This is the dataset for the Fake News classification competition on Kaggle. A test CSV is included to check predictions.
**Citation**
William Lifferth. (2018). Fake News. Kaggle. https://kaggle.com/competitions/fake-news |
KomeijiForce/Inbedder-Pretrain-Data | ---
license: mit
---
|
daler-westerops/chatbot-audio | ---
license: mit
---
|
Snoopy04/arc-de-500 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
struct:
- name: text
sequence: string
- name: label
sequence: string
- name: answerKey
dtype: string
- name: question_de
dtype: string
- name: choices_de
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: translation_de
dtype: string
splits:
- name: test
num_bytes: 499426.1945392491
num_examples: 500
- name: validation
num_bytes: 296743.34448160534
num_examples: 294
download_size: 452903
dataset_size: 796169.5390208545
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
CyberHarem/julia_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of julia/ユリア/尤莉亚/율리아 (Nikke: Goddess of Victory)
This is the dataset of julia/ユリア/尤莉亚/율리아 (Nikke: Goddess of Victory), containing 19 images and their tags.
The core tags of this character are `bangs, breasts, red_eyes, short_hair, hair_ornament, hair_between_eyes, hair_flower, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 36.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/julia_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 16.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/julia_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 40 | 32.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/julia_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 29.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/julia_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 40 | 48.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/julia_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/julia_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, elbow_gloves, solo, white_gloves, black_dress, closed_mouth, looking_at_viewer, cleavage, grey_hair, holding_instrument, red_rose, violin |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | elbow_gloves | solo | white_gloves | black_dress | closed_mouth | looking_at_viewer | cleavage | grey_hair | holding_instrument | red_rose | violin |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:-------|:---------------|:--------------|:---------------|:--------------------|:-----------|:------------|:---------------------|:-----------|:---------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
tahrirchi/uz-crawl | ---
annotations_creators:
- no-annotation
language:
- uz
license: apache-2.0
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
pretty_name: UzCrawl
configs:
- config_name: default
data_files:
- split: news
path: data/news-*
- split: telegram_blogs
path: data/telegram_blogs-*
dataset_info:
features:
- name: text
dtype: string
- name: timestamp
dtype: string
- name: source
dtype: string
splits:
- name: news
num_bytes: 3272404822
num_examples: 964268
- name: telegram_blogs
num_bytes: 367462330
num_examples: 368017
download_size: 1462920936
dataset_size: 3639867152
tags:
- uz
- crawl
- telegram_blogs
---
# Dataset Card for UzCrawl
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://tahrirchi.uz/grammatika-tekshiruvi](https://tahrirchi.uz/grammatika-tekshiruvi)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 3.52 GB
- **Size of the generated dataset:** 1.58 GB
- **Total amount of disk used:** 5.1 GB
### Dataset Summary
In an effort to democratize research on low-resource languages, we release the UzCrawl dataset, a web and Telegram crawl corpus consisting of materials from nearly 1.2 million unique sources in the Uzbek language.
Please refer to our [blogpost](https://tahrirchi.uz/grammatika-tekshiruvi) and paper (Coming soon!) for further details.
To load and use the dataset, run this script:
```python
from datasets import load_dataset
uz_crawl = load_dataset("tahrirchi/uz-crawl")
```
## Dataset Structure
### Data Instances
#### plain_text
- **Size of downloaded dataset files:** 3.52 GB
- **Size of the generated dataset:** 1.58 GB
- **Total amount of disk used:** 5.1 GB
An example of 'news' looks as follows.
```
{
'text': "O’zbekiston Respublikasi Vazirlar Mahkamasining 2019 yil 24 iyuldagi 620-son qarori bilan tasdiqlangan «Xorijiy davlatlarda ta'lim olganlik to’g’risidagi hujjatlarni tan olish tartibi to’g’risida»gi Nizom ijrosini ta'minlash maqsadida Ta'lim sifatini nazorat qilish davlat inspeksiyasida (Toshkent shahar, Chilonzor tumani, Nurxon ko’chasi, 21-uy) 2019 yil 9–14 sentabr kunlari sohalar bo’yicha sinov testlari bo’lib o’tishi rejalashtirilgan.\nTa'lim sifatini nazorat qilish davlat inspeksiyasi matbuot xizmati xabariga\xa0ko’ra, «Huquqshunoslik», «Sog’liqni saqlash va ijtimoiy ta'minot», «Iqtisodiyot», «Qishloq xo’jaligi, muhandislik, ishlov berish va qurilish» hamda «O’qituvchilar tayyorlash va pedagogik fanlar» sohalari bo’yicha sinov testlari o’tkaziladigan sanasi va sinov testida ishtirok etuvchilar ro’yxati jadvalga muvofiq belgilanadi.\nTa'lim sifatini nazorat qilish davlat inspeksiyasi ogohlantirishicha, xorijiy davlatlarda ta'lim olganlik to’g’risidagi hujjatlarni tan olish uchun belgilangan sinov testlariga o’z vaqtida kelmagan, sinov testida ishtirok etuvchilar ro’yxatida mavjud bo’lmagan talabgorlarga sinovlarga kirishga ruxsat etilmaydi.",
'timestamp': '2019-06-09',
'source': 'https://kun.uz/uz/news/2019/09/06/xorijda-talim-olganlik-togrisidagi-hujjatlarni-tan-olish-uchun-testlar-otkaziladigan-kunlar-malum-boldi'
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature that contains text.
- `timestamp`: a `string` feature that contains timestamp of the material.
- `source`: a `string` feature that contains url of the material.
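Because every record carries a `source` URL, simple slices of the corpus can be taken with the standard library alone. A sketch over records shaped like the example above (the rows here are shortened placeholders, not real corpus entries):

```python
from urllib.parse import urlparse

# Placeholder rows with the same fields as the corpus (text shortened).
records = [
    {"text": "...", "timestamp": "2019-06-09", "source": "https://kun.uz/uz/news/2019/09/06/example"},
    {"text": "...", "timestamp": "2021-01-15", "source": "https://example.uz/2021/01/15/post"},
]

def from_domain(rows, domain):
    """Keep only records whose source URL is on the given domain."""
    return [r for r in rows if urlparse(r["source"]).netloc == domain]

kun_uz = from_domain(records, "kun.uz")
print(len(kun_uz))  # 1: only the first placeholder row matches
```

The same pattern works after `load_dataset("tahrirchi/uz-crawl")`, iterating over a split instead of the placeholder list.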
### Data Splits
| name | |
|-----------------|--------:|
| news | 964268 |
| telegram_blogs | 368017 |
## Dataset Creation
The news portion has been crawled from 21 different websites using the [Scrapy](https://scrapy.org/) framework, and the telegram_blogs portion consists of manually curated texts from 81 high-quality Telegram channels.
## Citation
Please cite this dataset using the following format:
```
@online{Mamasaidov2023UzCrawl,
author = {Mukhammadsaid Mamasaidov and Abror Shopulatov},
title = {UzCrawl dataset},
year = {2023},
url = {https://huggingface.co/datasets/tahrirchi/uz-crawl},
note = {Accessed: 2023-10-28}, % change this date
urldate = {2023-10-28} % change this date
}
```
## Gratitude
We are thankful to these awesome organizations and people for helping make it happen:
- [Asadbek Kiyomov](https://www.linkedin.com/in/asadbey): for his work at the beginning of the project.
- [Ilya Gusev](https://github.com/IlyaGusev/): for his advice throughout the process
- [David Dale](https://daviddale.ru): for his advice throughout the process
## Contacts
We believe that this work will inspire enthusiasts around the world to uncover the hidden beauty of low-resource languages, in particular Uzbek.
For further development and issues with the dataset, please contact m.mamasaidov@tahrirchi.uz or a.shopolatov@tahrirchi.uz. |
KnutJaegersberg/Interpretable_word_embeddings_large_cskg | ---
license: mit
---
These embeddings result from applying SemAxis (https://arxiv.org/abs/1806.05521) to common sense knowledge graph embeddings (https://arxiv.org/abs/2012.11490).
|
Clumsy1/coco-caps | ---
license: apache-2.0
---
|
subset-data/finetune-data-4127aad49eb7 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 439213.3333333333
num_examples: 56
- name: test
num_bytes: 31372.380952380954
num_examples: 4
- name: valid
num_bytes: 23529.285714285714
num_examples: 3
download_size: 165314
dataset_size: 494115.0
---
# Dataset Card for "finetune-data-4127aad49eb7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/augmentatio-standardized_cluster_0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 25231331
num_examples: 2360
download_size: 7470973
dataset_size: 25231331
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "augmentatio-standardized_cluster_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Abdullah-Habib/my_Images | ---
license: apache-2.0
---
|
Kipol/vs_art | ---
license: cc
---
|
open-llm-leaderboard/details_migtissera__Tess-XS-v1.1 | ---
pretty_name: Evaluation run of migtissera/Tess-XS-v1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Tess-XS-v1.1](https://huggingface.co/migtissera/Tess-XS-v1.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-XS-v1.1_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T08:39:10.846213](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-XS-v1.1_public/blob/main/results_2023-11-23T08-39-10.846213.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6253362884117736,\n\
\ \"acc_stderr\": 0.03254975101958803,\n \"acc_norm\": 0.6343561981840767,\n\
\ \"acc_norm_stderr\": 0.0332634036672251,\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.49923681207340576,\n\
\ \"mc2_stderr\": 0.01551504317540587,\n \"em\": 0.18278104026845637,\n\
\ \"em_stderr\": 0.003957987703151033,\n \"f1\": 0.27069211409396043,\n\
\ \"f1_stderr\": 0.004030013722161818\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009126,\n\
\ \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6512646883091018,\n\
\ \"acc_stderr\": 0.004755960559929163,\n \"acc_norm\": 0.8405696076478789,\n\
\ \"acc_norm_stderr\": 0.003653288043555801\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.049020713000019756,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.049020713000019756\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n\
\ \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n\
\ \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n\
\ \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n\
\ \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n\
\ \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"\
acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"\
acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612917,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612917\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069425,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069425\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n\
\ \"acc_stderr\": 0.016083749986853697,\n \"acc_norm\": 0.36312849162011174,\n\
\ \"acc_norm_stderr\": 0.016083749986853697\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983576,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983576\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.49923681207340576,\n\
\ \"mc2_stderr\": 0.01551504317540587\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987726\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.18278104026845637,\n \
\ \"em_stderr\": 0.003957987703151033,\n \"f1\": 0.27069211409396043,\n\
\ \"f1_stderr\": 0.004030013722161818\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.16224412433661864,\n \"acc_stderr\": 0.010155130880393524\n\
\ }\n}\n```"
repo_url: https://huggingface.co/migtissera/Tess-XS-v1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|arc:challenge|25_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|arc:challenge|25_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|drop|3_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|drop|3_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|gsm8k|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|gsm8k|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hellaswag|10_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hellaswag|10_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T08-35-10.663595.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T08-39-10.846213.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T08-39-10.846213.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- '**/details_harness|winogrande|5_2023-11-23T08-35-10.663595.parquet'
- split: 2023_11_23T08_39_10.846213
path:
- '**/details_harness|winogrande|5_2023-11-23T08-39-10.846213.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T08-39-10.846213.parquet'
- config_name: results
data_files:
- split: 2023_11_23T08_35_10.663595
path:
- results_2023-11-23T08-35-10.663595.parquet
- split: 2023_11_23T08_39_10.846213
path:
- results_2023-11-23T08-39-10.846213.parquet
- split: latest
path:
- results_2023-11-23T08-39-10.846213.parquet
---
# Dataset Card for Evaluation run of migtissera/Tess-XS-v1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Tess-XS-v1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Tess-XS-v1.1](https://huggingface.co/migtissera/Tess-XS-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-XS-v1.1_public",
"harness_winogrande_5",
	split="latest")
```
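As a minimal sketch of working with the aggregated results, the headline metrics can be pulled out of the results dictionary and converted to percentages. The values are hard-coded here from the "all" block of the latest results shown below; in practice they would come from loading the "results" config of this dataset.

```python
# Headline metrics from the "all" block of the latest results JSON
# (hard-coded here; normally loaded from the "results" config).
latest_all = {
    "acc": 0.6253362884117736,
    "acc_norm": 0.6343561981840767,
    "mc1": 0.3463892288861689,
    "mc2": 0.49923681207340576,
}

# Convert fractions to percentages, rounded to two decimal places.
summary = {metric: round(value * 100, 2) for metric, value in latest_all.items()}
print(summary)
```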
## Latest results
These are the [latest results from run 2023-11-23T08:39:10.846213](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-XS-v1.1_public/blob/main/results_2023-11-23T08-39-10.846213.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6253362884117736,
"acc_stderr": 0.03254975101958803,
"acc_norm": 0.6343561981840767,
"acc_norm_stderr": 0.0332634036672251,
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.49923681207340576,
"mc2_stderr": 0.01551504317540587,
"em": 0.18278104026845637,
"em_stderr": 0.003957987703151033,
"f1": 0.27069211409396043,
"f1_stderr": 0.004030013722161818
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.014356399418009126,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175452
},
"harness|hellaswag|10": {
"acc": 0.6512646883091018,
"acc_stderr": 0.004755960559929163,
"acc_norm": 0.8405696076478789,
"acc_norm_stderr": 0.003653288043555801
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.61,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593542,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612917,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612917
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069425,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069425
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.016083749986853697,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.016083749986853697
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983576,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983576
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.49923681207340576,
"mc2_stderr": 0.01551504317540587
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987726
},
"harness|drop|3": {
"em": 0.18278104026845637,
"em_stderr": 0.003957987703151033,
"f1": 0.27069211409396043,
"f1_stderr": 0.004030013722161818
},
"harness|gsm8k|5": {
"acc": 0.16224412433661864,
"acc_stderr": 0.010155130880393524
}
}
```
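The per-task scores above can be aggregated programmatically. The sketch below is illustrative only: the `results` dict is a hand-copied excerpt of two MMLU subtasks and one non-MMLU task from the JSON above, not loaded from the result files, and it assumes the `harness|<task>|<n_shots>` key convention visible in the results.

```python
# Hand-copied excerpt of the results structure shown above; keys follow
# the "harness|<task>|<n_shots>" naming scheme used by the eval harness.
results = {
    "harness|hendrycksTest-virology|5": {"acc_norm": 0.5240963855421686},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.7894736842105263},
    "harness|gsm8k|5": {"acc": 0.16224412433661864},
}

def mmlu_mean(results: dict) -> float:
    """Unweighted mean of acc_norm over the MMLU (hendrycksTest) subtasks,
    skipping non-MMLU entries such as gsm8k."""
    scores = [
        v["acc_norm"]
        for key, v in results.items()
        if key.startswith("harness|hendrycksTest-")
    ]
    return sum(scores) / len(scores)

print(mmlu_mean(results))  # mean over the two hendrycksTest excerpts only
```

The leaderboard's reported MMLU number averages all 57 subtasks; this two-entry excerpt just demonstrates the key-filtering step.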
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tyzhu/find_last_sent_train_400_eval_40 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 1071032
num_examples: 840
- name: validation
num_bytes: 41250
num_examples: 40
download_size: 0
dataset_size: 1112282
---
# Dataset Card for "find_last_sent_train_400_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DynamicSuperb/vocalSoundRecognition_vocalSound | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: file
dtype: string
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 1427179683.235
num_examples: 3591
download_size: 1107141703
dataset_size: 1427179683.235
---
# Dataset Card for "vocalSoundRecognition_vocalSound"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ilhamxx/Receipt_dataset | ---
license: unknown
---
|
amine-khelif/MLX_ds | ---
dataset_info:
features:
- name: questions
dtype: string
- name: answers
dtype: string
- name: documentation
dtype: string
splits:
- name: train
num_bytes: 9097858
num_examples: 2983
download_size: 2727586
dataset_size: 9097858
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FaalSa/cluster0_5 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 1775340
num_examples: 45
- name: validation
num_bytes: 1796940
num_examples: 45
- name: test
num_bytes: 1818540
num_examples: 45
download_size: 1560855
dataset_size: 5390820
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
ThWu/cleaned_prompt_r_2 | ---
dataset_info:
features:
- name: conversations
sequence: string
splits:
- name: train
num_bytes: 156626089.82132664
num_examples: 266593
download_size: 90801143
dataset_size: 156626089.82132664
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV4-Vision-7B | ---
pretty_name: Evaluation run of Nitral-AI/Eris_PrimeV4-Vision-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Nitral-AI/Eris_PrimeV4-Vision-7B](https://huggingface.co/Nitral-AI/Eris_PrimeV4-Vision-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV4-Vision-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T17:42:31.708199](https://huggingface.co/datasets/open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV4-Vision-7B/blob/main/results_2024-03-27T17-42-31.708199.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6544670895145546,\n\
\ \"acc_stderr\": 0.03203812543812197,\n \"acc_norm\": 0.654814407682698,\n\
\ \"acc_norm_stderr\": 0.03269448680671681,\n \"mc1\": 0.5091799265605875,\n\
\ \"mc1_stderr\": 0.017500550724819753,\n \"mc2\": 0.6776099528014666,\n\
\ \"mc2_stderr\": 0.01487299617879486\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.013678810399518824,\n\
\ \"acc_norm\": 0.7022184300341296,\n \"acc_norm_stderr\": 0.01336308010724448\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6950806612228639,\n\
\ \"acc_stderr\": 0.004594323838650357,\n \"acc_norm\": 0.8756223859788886,\n\
\ \"acc_norm_stderr\": 0.003293374019781595\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.0253795249107784,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.0253795249107784\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.022891687984554963,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.022891687984554963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.023661296393964273,\n\
\ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.023661296393964273\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848036,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371807,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371807\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525817,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525817\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4122905027932961,\n\
\ \"acc_stderr\": 0.01646320023811452,\n \"acc_norm\": 0.4122905027932961,\n\
\ \"acc_norm_stderr\": 0.01646320023811452\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959603,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959603\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.01274920600765747,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.01274920600765747\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5091799265605875,\n\
\ \"mc1_stderr\": 0.017500550724819753,\n \"mc2\": 0.6776099528014666,\n\
\ \"mc2_stderr\": 0.01487299617879486\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8168902920284136,\n \"acc_stderr\": 0.010869778633168362\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6785443517816527,\n \
\ \"acc_stderr\": 0.012864471384836705\n }\n}\n```"
repo_url: https://huggingface.co/Nitral-AI/Eris_PrimeV4-Vision-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|arc:challenge|25_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|gsm8k|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hellaswag|10_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-42-31.708199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T17-42-31.708199.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- '**/details_harness|winogrande|5_2024-03-27T17-42-31.708199.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T17-42-31.708199.parquet'
- config_name: results
data_files:
- split: 2024_03_27T17_42_31.708199
path:
- results_2024-03-27T17-42-31.708199.parquet
- split: latest
path:
- results_2024-03-27T17-42-31.708199.parquet
---
# Dataset Card for Evaluation run of Nitral-AI/Eris_PrimeV4-Vision-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Nitral-AI/Eris_PrimeV4-Vision-7B](https://huggingface.co/Nitral-AI/Eris_PrimeV4-Vision-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV4-Vision-7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-27T17:42:31.708199](https://huggingface.co/datasets/open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV4-Vision-7B/blob/main/results_2024-03-27T17-42-31.708199.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6544670895145546,
"acc_stderr": 0.03203812543812197,
"acc_norm": 0.654814407682698,
"acc_norm_stderr": 0.03269448680671681,
"mc1": 0.5091799265605875,
"mc1_stderr": 0.017500550724819753,
"mc2": 0.6776099528014666,
"mc2_stderr": 0.01487299617879486
},
"harness|arc:challenge|25": {
"acc": 0.6757679180887372,
"acc_stderr": 0.013678810399518824,
"acc_norm": 0.7022184300341296,
"acc_norm_stderr": 0.01336308010724448
},
"harness|hellaswag|10": {
"acc": 0.6950806612228639,
"acc_stderr": 0.004594323838650357,
"acc_norm": 0.8756223859788886,
"acc_norm_stderr": 0.003293374019781595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.0253795249107784,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.0253795249107784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554963,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.023661296393964273,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.023661296393964273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848036,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371807,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371807
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525817,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4122905027932961,
"acc_stderr": 0.01646320023811452,
"acc_norm": 0.4122905027932961,
"acc_norm_stderr": 0.01646320023811452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959603,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959603
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.01274920600765747,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.01274920600765747
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5091799265605875,
"mc1_stderr": 0.017500550724819753,
"mc2": 0.6776099528014666,
"mc2_stderr": 0.01487299617879486
},
"harness|winogrande|5": {
"acc": 0.8168902920284136,
"acc_stderr": 0.010869778633168362
},
"harness|gsm8k|5": {
"acc": 0.6785443517816527,
"acc_stderr": 0.012864471384836705
}
}
```
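The per-task standard errors in the JSON above are consistent with the usual standard error of a proportion. As a sanity check, assuming the `medical_genetics` subset has its standard 100 questions and that the harness divides by n - 1 (both assumptions, not stated in the results), the reported value is reproduced almost exactly:

```python
import math

# Reported values for harness|hendrycksTest-medical_genetics|5 above.
acc = 0.71
reported_stderr = 0.045604802157206845

# Standard error of a proportion with a sample (n - 1) denominator;
# n = 100 is an assumption based on the usual size of this MMLU subset.
n = 100
stderr = math.sqrt(acc * (1 - acc) / (n - 1))

assert abs(stderr - reported_stderr) < 1e-7
```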
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
narenb7/cool_new_dataset | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: ad
dtype: string
splits:
- name: train
num_bytes: 10463
num_examples: 24
download_size: 10493
dataset_size: 10463
---
# Dataset Card for "cool_new_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
generated_reviews_enth | ---
annotations_creators:
- expert-generated
- machine-generated
language_creators:
- machine-generated
language:
- en
- th
license:
- cc-by-sa-4.0
multilinguality:
- translation
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- translation
- text-classification
task_ids:
- multi-class-classification
- semantic-similarity-classification
pretty_name: generated_reviews_enth
dataset_info:
features:
- name: translation
dtype:
translation:
languages:
- en
- th
- name: review_star
dtype: int32
- name: correct
dtype:
class_label:
names:
'0': neg
'1': pos
config_name: generated_reviews_enth
splits:
- name: train
num_bytes: 147673215
num_examples: 141369
- name: validation
num_bytes: 16409966
num_examples: 15708
- name: test
num_bytes: 18133523
num_examples: 17453
download_size: 59490601
dataset_size: 182216704
---
# Dataset Card for generated_reviews_enth
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://airesearch.in.th/
- **Repository:** https://github.com/vistec-ai/generated_reviews_enth
- **Paper:** https://arxiv.org/pdf/2007.03541.pdf
- **Leaderboard:**
- **Point of Contact:** [AIResearch](http://airesearch.in.th/)
### Dataset Summary
`generated_reviews_enth` was created as part of [scb-mt-en-th-2020](https://arxiv.org/pdf/2007.03541.pdf) for the machine translation task. This dataset (referred to as `generated_reviews_yn` in [scb-mt-en-th-2020](https://arxiv.org/pdf/2007.03541.pdf)) consists of English product reviews generated by [CTRL](https://arxiv.org/abs/1909.05858), translated to Thai by the Google Translate API, and annotated as accepted or rejected (`correct`) by human annotators based on the fluency and adequacy of the translation. This allows it to be used for English-to-Thai translation quality estimation (binary label), machine translation, and sentiment analysis.
### Supported Tasks and Leaderboards
English-to-Thai translation quality estimation (binary label) is the intended use. Other uses include machine translation and sentiment analysis.
### Languages
English, Thai
## Dataset Structure
### Data Instances
```
{'correct': 0, 'review_star': 4, 'translation': {'en': "I had a hard time finding a case for my new LG Lucid 2 but finally found this one on amazon. The colors are really pretty and it works just as well as, if not better than the otterbox. Hopefully there will be more available by next Xmas season. Overall, very cute case. I love cheetah's. :)", 'th': 'เธเธฑเธเธกเธตเธเธฑเธเธซเธฒเนเธเธเธฒเธฃเธซเธฒเนเธเธชเธชเธณเธซเธฃเธฑเธ LG Lucid 2 เนเธซเธกเนเธเธญเธเธเธฑเธ เนเธเนเนเธเธเธตเนเธชเธธเธเธเนเธเธเนเธเธชเธเธตเนเนเธ Amazon เธชเธตเธชเธงเธขเธกเธฒเธเนเธฅเธฐเนเธเนเธเธฒเธเนเธเนเธเธตเนเธเนเธเนเธเธตเธขเธงเธเธฑเธเธเนเธฒเนเธกเนเธเธตเธเธงเนเธฒเธเธฒเธ เธซเธงเธฑเธเธงเนเธฒเธเธฐเธกเธตเนเธซเนเธกเธฒเธเธเธถเนเธเนเธเธเนเธงเธเนเธเธจเธเธฒเธฅเธเธฃเธดเธชเธเนเธกเธฒเธชเธซเธเนเธฒ เนเธเธขเธฃเธงเธกเนเธฅเนเธงเธเนเธฒเธฃเธฑเธเธกเธฒเธ เน เธเธฑเธเธฃเธฑเธเนเธชเธทเธญเธเธตเธเธฒเธซเน :)'}}
{'correct': 0, 'review_star': 1, 'translation': {'en': "This is the second battery charger I bought as a Christmas present, that came from Amazon, after one purchased before for my son. His was still working. The first charger, received in July, broke apart and wouldn't charge anymore. Just found out two days ago they discontinued it without warning. It took quite some time to find the exact replacement charger. Too bad, really liked it. One of these days, will purchase an actual Nikon product, or go back to buying batteries.", 'th': 'เธเธตเนเนเธเนเธเนเธเธฃเธทเนเธญเธเธเธฒเธฃเนเธเนเธเธเนเธเธญเธฃเธตเนเธเนเธญเธเธเธตเนเธชเธญเธเธเธตเนเธเธฑเธเธเธทเนเธญเนเธเนเธเธเธญเธเธเธงเธฑเธเธเธฃเธดเธชเธเนเธกเธฒเธชเธเธถเนเธเธกเธฒเธเธฒเธเธญเนเธกเธเธญเธเธซเธฅเธฑเธเธเธฒเธเธเธตเนเธเธทเนเธญเธกเธฒเนเธเธทเนเธญเธฅเธนเธเธเธฒเธขเธเธญเธเธเธฑเธ เนเธเธฒเธขเธฑเธเธเธณเธเธฒเธเธญเธขเธนเน เนเธเธฃเธทเนเธญเธเธเธฒเธฃเนเธเนเธฃเธเธเธตเนเนเธเนเธฃเธฑเธเนเธเนเธเธทเธญเธเธเธฃเธเธเธฒเธเธกเนเธเธเนเธเนเธเธเธดเนเธ เน เนเธฅเธฐเธเธฐเนเธกเนเธเธฒเธฃเนเธเธญเธตเธเธเนเธญเนเธ เนเธเธดเนเธเธเนเธเธเธเนเธกเธทเนเธญเธชเธญเธเธงเธฑเธเธเนเธญเธเธเธงเธเนเธเธฒเธซเธขเธธเธเธกเธฑเธเนเธเธขเนเธกเนเธกเธตเธเธฒเธฃเนเธเธทเธญเธเธฅเนเธงเธเธซเธเนเธฒ เนเธเนเนเธงเธฅเธฒเธเธญเธชเธกเธเธงเธฃเนเธเธเธฒเธฃเธซเธฒเธเธตเนเธเธฒเธฃเนเธเธเธตเนเธเธนเธเธเนเธญเธ เนเธขเนเธกเธฒเธเธเธญเธเธกเธฒเธ เธชเธฑเธเธงเธฑเธเธซเธเธถเนเธเธเธฐเธเธทเนเธญเธเธฅเธดเธเธ เธฑเธเธเน Nikon เธเธฃเธดเธเธซเธฃเธทเธญเธเธฅเธฑเธเนเธเธเธทเนเธญเนเธเธเนเธเธญเธฃเธตเน'}}
{'correct': 1, 'review_star': 1, 'translation': {'en': 'I loved the idea of having a portable computer to share pictures with family and friends on my big screen. It worked really well for about 3 days, then when i opened it one evening there was water inside where all the wires came out. I cleaned that up and put some tape over that, so far, no leaks. My husband just told me yesterday, however, that this thing is trash.', 'th': 'เธเธฑเธเธเธญเธเนเธญเนเธเธตเธขเธเธตเนเธกเธตเธเธญเธกเธเธดเธงเนเธเธญเธฃเนเธเธเธเธฒเนเธเธทเนเธญเนเธเธฃเนเธฃเธนเธเธ เธฒเธเธเธฑเธเธเธฃเธญเธเธเธฃเธฑเธงเนเธฅเธฐเนเธเธทเนเธญเธ เน เธเธเธซเธเนเธฒเธเธญเธเธเธฒเธเนเธซเธเนเธเธญเธเธเธฑเธ เธกเธฑเธเนเธเนเธเธฒเธเนเธเนเธเธตเธเธฃเธดเธ เน เธเธฃเธฐเธกเธฒเธ 3 เธงเธฑเธเธเธฒเธเธเธฑเนเธเนเธกเธทเนเธญเธเธฑเธเนเธเธดเธเธกเธฑเธเนเธเนเธขเนเธเธงเธฑเธเธซเธเธถเนเธเธกเธตเธเนเธณเธญเธขเธนเนเธ เธฒเธขเนเธเธเธตเนเธเธถเนเธเธชเธฒเธขเนเธเธเธฑเนเธเธซเธกเธเธญเธญเธเธกเธฒ เธเธฑเธเธเธณเธเธงเธฒเธกเธชเธฐเธญเธฒเธเธกเธฑเธเนเธฅเนเธงเธงเธฒเธเนเธเธเนเธงเนเธเธตเนเธเธฑเนเธเธเธเธเธถเธเธเธญเธเธเธตเนเนเธกเนเธกเธตเธฃเธญเธขเธฃเธฑเนเธง เธชเธฒเธกเธตเธเธญเธเธเธฑเธเนเธเธดเนเธเธเธญเธเธเธฑเธเนเธกเธทเนเธญเธงเธฒเธเธเธตเนเธงเนเธฒเธชเธดเนเธเธเธตเนเนเธเนเธเธเธขเธฐ'}}
```
### Data Fields
- `translation`:
- `en`: English product reviews generated by [CTRL](https://arxiv.org/abs/1909.05858)
- `th`: Thai product reviews translated from `en` by Google Translate API
- `review_star`: Stars of the generated reviews, put as condition for [CTRL](https://arxiv.org/abs/1909.05858)
- `correct`: 1 if the English-to-Thai translation is accepted (`correct`) based on fluency and adequacy of the translation by human annotators else 0
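For the quality estimation use case, each record maps directly to a (source, target, label) triple. A minimal sketch of that mapping, using one record in the schema above (the field values are illustrative, and `to_qe_example` is a hypothetical helper name, not part of the dataset):

```python
# A record in the dataset's schema (values are illustrative only).
record = {
    "translation": {"en": "Great case, fits perfectly.",
                    "th": "(Thai translation of the review)"},
    "review_star": 4,
    "correct": 1,
}

def to_qe_example(rec):
    """Map one dataset record to a translation-quality-estimation triple."""
    return {
        "src": rec["translation"]["en"],
        "tgt": rec["translation"]["th"],
        "label": rec["correct"],  # 1 = accepted translation, 0 = rejected
    }

example = to_qe_example(record)
assert example["label"] == 1
```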
### Data Splits
| | train | valid | test |
|-----------------|--------|-------|-------|
| # samples | 141369 | 15708 | 17453 |
| # correct:0 | 99296 | 10936 | 12208 |
| # correct:1 | 42073 | 4772 | 5245 |
| # review_star:1 | 50418 | 5628 | 6225 |
| # review_star:2 | 22876 | 2596 | 2852 |
| # review_star:3 | 22825 | 2521 | 2831 |
| # review_star:4 | 22671  | 2517  | 2778  |
| # review_star:5 | 22579 | 2446 | 2767 |
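Reading the five star rows as the counts for stars 1 through 5, the per-label and per-star counts partition each split exactly; a quick consistency check on the train column:

```python
train_total = 141369

correct_counts = {0: 99296, 1: 42073}
star_counts = {1: 50418, 2: 22876, 3: 22825, 4: 22671, 5: 22579}

# Both breakdowns should partition the train split exactly.
assert sum(correct_counts.values()) == train_total
assert sum(star_counts.values()) == train_total
```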
## Dataset Creation
### Curation Rationale
`generated_reviews_enth` was created as part of [scb-mt-en-th-2020](https://arxiv.org/pdf/2007.03541.pdf) for the machine translation task. This dataset (referred to as `generated_reviews_yn` in [scb-mt-en-th-2020](https://arxiv.org/pdf/2007.03541.pdf)) consists of English product reviews generated by [CTRL](https://arxiv.org/abs/1909.05858), translated to Thai by the Google Translate API, and annotated as accepted or rejected (`correct`) by human annotators based on the fluency and adequacy of the translation. This allows it to be used for English-to-Thai translation quality estimation (binary label), machine translation, and sentiment analysis.
### Source Data
#### Initial Data Collection and Normalization
The data generation process is as follows:
- `en` is generated with conditional generation from [CTRL](https://arxiv.org/abs/1909.05858), conditioning on a star rating for each generated product review.
- `th` is translated from `en` using the Google Translate API.
- `correct` is annotated by human annotators as accepted (1) or rejected (0) based on the fluency and adequacy of the translation.
For this translation quality estimation dataset, we apply the following preprocessing:
- Drop duplicates on `en`, `th`, `review_star`, `correct`; duplicates might exist because the translation checking is done by annotators.
- Remove reviews that are not between 1-5 stars.
- Remove reviews whose `correct` is not 0 or 1.
- Deduplicate on `en`, which contains the source sentences.
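The filtering steps above can be sketched in plain Python (the toy rows below are made up to exercise each rule, and `preprocess` is a hypothetical helper, not code from the original pipeline):

```python
# Toy rows exercising each filtering rule (values are made up).
rows = [
    {"en": "Nice phone.", "th": "A",  "review_star": 5, "correct": 1},
    {"en": "Nice phone.", "th": "A",  "review_star": 5, "correct": 1},  # exact duplicate
    {"en": "Bad cable.",  "th": "B",  "review_star": 0, "correct": 0},  # invalid star
    {"en": "Okay bag.",   "th": "C",  "review_star": 3, "correct": 2},  # invalid label
    {"en": "Nice phone.", "th": "A2", "review_star": 5, "correct": 1},  # duplicate source
]

def preprocess(raw):
    seen_full, seen_en, out = set(), set(), []
    for r in raw:
        key = (r["en"], r["th"], r["review_star"], r["correct"])
        if key in seen_full:
            continue  # drop exact duplicates on all four fields
        seen_full.add(key)
        if not 1 <= r["review_star"] <= 5:
            continue  # keep only 1-5 star reviews
        if r["correct"] not in (0, 1):
            continue  # keep only binary labels
        if r["en"] in seen_en:
            continue  # deduplicate on the English source sentence
        seen_en.add(r["en"])
        out.append(r)
    return out

assert len(preprocess(rows)) == 1  # only the first row survives
```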
#### Who are the source language producers?
[CTRL](https://arxiv.org/abs/1909.05858)
### Annotations
#### Annotation process
Annotators are given English and Thai product review pairs. They are asked to label the pair as acceptable translation or not based on fluency and adequacy of the translation.
#### Who are the annotators?
Human annotators of [Hope Data Annotations](https://www.hopedata.org/) hired by [AIResearch.in.th](http://airesearch.in.th/)
### Personal and Sensitive Information
The authors do not expect any personal or sensitive information to be in the generated product reviews, but they could slip through from pretraining of [CTRL](https://arxiv.org/abs/1909.05858).
## Considerations for Using the Data
### Social Impact of Dataset
- English-Thai translation quality estimation for machine translation
- Product review classification for Thai
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
Due to annotation process constraints, the number of one-star reviews is notably higher than that of other star ratings. This makes the dataset slightly imbalanced.
## Additional Information
### Dataset Curators
The dataset was created by [AIResearch.in.th](http://airesearch.in.th/)
### Licensing Information
CC BY-SA 4.0
### Citation Information
```
@article{lowphansirikul2020scb,
title={scb-mt-en-th-2020: A Large English-Thai Parallel Corpus},
author={Lowphansirikul, Lalita and Polpanumas, Charin and Rutherford, Attapol T and Nutanong, Sarana},
journal={arXiv preprint arXiv:2007.03541},
year={2020}
}
```
### Contributions
Thanks to [@cstorm125](https://github.com/cstorm125) for adding this dataset. |
DialogueCharacter/english_wizard_unfiltered | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: response
sequence: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 278812623
num_examples: 121930
download_size: 144938153
dataset_size: 278812623
---
# Dataset Card for "english_wizard_unfiltered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
update0909/hf-stack-zyx | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 114334795
num_examples: 7212
download_size: 38900746
dataset_size: 114334795
---
# Dataset Card for "hf-stack-zyx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
universalner/uner_llm_inst_chinese | ---
license: cc-by-sa-4.0
language:
- zh
task_categories:
- token-classification
dataset_info:
- config_name: zh_pud
splits:
- name: test
num_examples: 999
- config_name: zh_gsd
splits:
- name: test
num_examples: 499
- name: dev
num_examples: 499
- name: train
num_examples: 3996
- config_name: zh_gsdsimp
splits:
- name: test
num_examples: 499
- name: dev
num_examples: 499
- name: train
num_examples: 3996
---
# Dataset Card for Universal NER v1 in the Aya format - Chinese subset
This dataset converts the Chinese data from the original Universal NER v1 into the Aya instruction format. It is released here under the same CC-BY-SA 4.0 license and conditions.
The dataset contains different subsets (`zh_pud`, `zh_gsd`, `zh_gsdsimp`) with their dev/test/train splits. For more details, please refer to the sections below.
## Dataset Details
For the original Universal NER dataset v1 and more details, please check https://huggingface.co/datasets/universalner/universal_ner.
For details on the conversion to the Aya instructions format, please see the complete version: https://huggingface.co/datasets/universalner/uner_llm_instructions
## Citation
If you utilize this dataset version, feel free to cite/footnote the complete version at https://huggingface.co/datasets/universalner/uner_llm_instructions, but please also cite the *original dataset publication*.
**BibTeX:**
```
@preprint{mayhew2023universal,
title={{Universal NER: A Gold-Standard Multilingual Named Entity Recognition Benchmark}},
author={Stephen Mayhew and Terra Blevins and Shuheng Liu and Marek Šuppa and Hila Gonen and Joseph Marvin Imperial and Börje F. Karlsson and Peiqin Lin and Nikola Ljubešić and LJ Miranda and Barbara Plank and Arij Riabi and Yuval Pinter},
year={2023},
eprint={2311.09122},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
open-llm-leaderboard/details_NobodyExistsOnTheInternet__clown-SUV-4x70b | ---
pretty_name: Evaluation run of NobodyExistsOnTheInternet/clown-SUV-4x70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NobodyExistsOnTheInternet/clown-SUV-4x70b](https://huggingface.co/NobodyExistsOnTheInternet/clown-SUV-4x70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NobodyExistsOnTheInternet__clown-SUV-4x70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-03T13:18:34.281350](https://huggingface.co/datasets/open-llm-leaderboard/details_NobodyExistsOnTheInternet__clown-SUV-4x70b/blob/main/results_2024-02-03T13-18-34.281350.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2424611028294509,\n\
\ \"acc_stderr\": 0.03031610890226359,\n \"acc_norm\": 0.2428216424826148,\n\
\ \"acc_norm_stderr\": 0.031122287295333263,\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.48811709450476065,\n\
\ \"mc2_stderr\": 0.016595901285138773\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20136518771331058,\n \"acc_stderr\": 0.011718927477444265,\n\
\ \"acc_norm\": 0.24744027303754265,\n \"acc_norm_stderr\": 0.01261035266329267\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26777534355706034,\n\
\ \"acc_stderr\": 0.004418948941099406,\n \"acc_norm\": 0.28291177056363276,\n\
\ \"acc_norm_stderr\": 0.00449493402546234\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322674,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322674\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.15555555555555556,\n\
\ \"acc_stderr\": 0.031309483648783144,\n \"acc_norm\": 0.15555555555555556,\n\
\ \"acc_norm_stderr\": 0.031309483648783144\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.025757559893106758,\n\
\ \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.025757559893106758\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2013888888888889,\n\
\ \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.2013888888888889,\n\
\ \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714534,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2170212765957447,\n \"acc_stderr\": 0.026947483121496228,\n\
\ \"acc_norm\": 0.2170212765957447,\n \"acc_norm_stderr\": 0.026947483121496228\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489358,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489358\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.02313528797432563,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.02313528797432563\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\
\ \"acc_stderr\": 0.024685979286239973,\n \"acc_norm\": 0.25161290322580643,\n\
\ \"acc_norm_stderr\": 0.024685979286239973\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22167487684729065,\n \"acc_stderr\": 0.0292255758924896,\n\
\ \"acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.0292255758924896\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.035886248000917075,\n\
\ \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.035886248000917075\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20707070707070707,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.032210245080411565,\n\
\ \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.032210245080411565\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.021362027725222724,\n\
\ \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.021362027725222724\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766124,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766124\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.19327731092436976,\n \"acc_stderr\": 0.025649470265889197,\n\
\ \"acc_norm\": 0.19327731092436976,\n \"acc_norm_stderr\": 0.025649470265889197\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.034454062719870546,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.034454062719870546\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24587155963302754,\n \"acc_stderr\": 0.018461940968708443,\n \"\
acc_norm\": 0.24587155963302754,\n \"acc_norm_stderr\": 0.018461940968708443\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2361111111111111,\n \"acc_stderr\": 0.028963702570791044,\n \"\
acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.028963702570791044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3088235294117647,\n \"acc_stderr\": 0.03242661719827218,\n \"\
acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.23628691983122363,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21524663677130046,\n\
\ \"acc_stderr\": 0.027584066602208263,\n \"acc_norm\": 0.21524663677130046,\n\
\ \"acc_norm_stderr\": 0.027584066602208263\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.03192193448934722,\n\
\ \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.03192193448934722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.04058042015646033,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.04058042015646033\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n\
\ \"acc_stderr\": 0.028286324075564393,\n \"acc_norm\": 0.24786324786324787,\n\
\ \"acc_norm_stderr\": 0.028286324075564393\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2515964240102171,\n\
\ \"acc_stderr\": 0.015517322365529638,\n \"acc_norm\": 0.2515964240102171,\n\
\ \"acc_norm_stderr\": 0.015517322365529638\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.0218552552634218,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.0218552552634218\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.02463004897982476,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.02463004897982476\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266733,\n \"\
acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266733\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24185136897001303,\n\
\ \"acc_stderr\": 0.010936550813827056,\n \"acc_norm\": 0.24185136897001303,\n\
\ \"acc_norm_stderr\": 0.010936550813827056\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250075,\n \
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250075\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072773,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072773\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.026882144922307748,\n\
\ \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 0.026882144922307748\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.30845771144278605,\n\
\ \"acc_stderr\": 0.032658195885126966,\n \"acc_norm\": 0.30845771144278605,\n\
\ \"acc_norm_stderr\": 0.032658195885126966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.034843315926805875,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.034843315926805875\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.48811709450476065,\n\
\ \"mc2_stderr\": 0.016595901285138773\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5248618784530387,\n \"acc_stderr\": 0.01403510288362775\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/NobodyExistsOnTheInternet/clown-SUV-4x70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|arc:challenge|25_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|gsm8k|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hellaswag|10_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T13-18-34.281350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T13-18-34.281350.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- '**/details_harness|winogrande|5_2024-02-03T13-18-34.281350.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-03T13-18-34.281350.parquet'
- config_name: results
data_files:
- split: 2024_02_03T13_18_34.281350
path:
- results_2024-02-03T13-18-34.281350.parquet
- split: latest
path:
- results_2024-02-03T13-18-34.281350.parquet
---
# Dataset Card for Evaluation run of NobodyExistsOnTheInternet/clown-SUV-4x70b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NobodyExistsOnTheInternet/clown-SUV-4x70b](https://huggingface.co/NobodyExistsOnTheInternet/clown-SUV-4x70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NobodyExistsOnTheInternet__clown-SUV-4x70b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-03T13:18:34.281350](https://huggingface.co/datasets/open-llm-leaderboard/details_NobodyExistsOnTheInternet__clown-SUV-4x70b/blob/main/results_2024-02-03T13-18-34.281350.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2424611028294509,
"acc_stderr": 0.03031610890226359,
"acc_norm": 0.2428216424826148,
"acc_norm_stderr": 0.031122287295333263,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.48811709450476065,
"mc2_stderr": 0.016595901285138773
},
"harness|arc:challenge|25": {
"acc": 0.20136518771331058,
"acc_stderr": 0.011718927477444265,
"acc_norm": 0.24744027303754265,
"acc_norm_stderr": 0.01261035266329267
},
"harness|hellaswag|10": {
"acc": 0.26777534355706034,
"acc_stderr": 0.004418948941099406,
"acc_norm": 0.28291177056363276,
"acc_norm_stderr": 0.00449493402546234
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322674,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322674
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.15555555555555556,
"acc_stderr": 0.031309483648783144,
"acc_norm": 0.15555555555555556,
"acc_norm_stderr": 0.031309483648783144
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.025757559893106758,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.025757559893106758
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2013888888888889,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.2013888888888889,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714534,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.026947483121496228,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.026947483121496228
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489358,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489358
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.02313528797432563,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.02313528797432563
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239973,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239973
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.0292255758924896,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.0292255758924896
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.035886248000917075,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.035886248000917075
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.032210245080411565,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.032210245080411565
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.021362027725222724,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.021362027725222724
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766124,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766124
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.025649470265889197,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.025649470265889197
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.034454062719870546,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.034454062719870546
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24587155963302754,
"acc_stderr": 0.018461940968708443,
"acc_norm": 0.24587155963302754,
"acc_norm_stderr": 0.018461940968708443
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.028963702570791044,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.028963702570791044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21524663677130046,
"acc_stderr": 0.027584066602208263,
"acc_norm": 0.21524663677130046,
"acc_norm_stderr": 0.027584066602208263
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.03192193448934722,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.03192193448934722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.04058042015646033,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.04058042015646033
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24786324786324787,
"acc_stderr": 0.028286324075564393,
"acc_norm": 0.24786324786324787,
"acc_norm_stderr": 0.028286324075564393
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2515964240102171,
"acc_stderr": 0.015517322365529638,
"acc_norm": 0.2515964240102171,
"acc_norm_stderr": 0.015517322365529638
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.0218552552634218,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.0218552552634218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.02463004897982476,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.02463004897982476
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.25,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266733,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266733
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24185136897001303,
"acc_stderr": 0.010936550813827056,
"acc_norm": 0.24185136897001303,
"acc_norm_stderr": 0.010936550813827056
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250075,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072773,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072773
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22857142857142856,
"acc_stderr": 0.026882144922307748,
"acc_norm": 0.22857142857142856,
"acc_norm_stderr": 0.026882144922307748
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.30845771144278605,
"acc_stderr": 0.032658195885126966,
"acc_norm": 0.30845771144278605,
"acc_norm_stderr": 0.032658195885126966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.48811709450476065,
"mc2_stderr": 0.016595901285138773
},
"harness|winogrande|5": {
"acc": 0.5248618784530387,
"acc_stderr": 0.01403510288362775
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
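As a small sketch (not part of the official tooling; it assumes only the JSON layout shown above), the aggregate accuracy and the per-task entries can be extracted like this:

```python
import json

# Abbreviated results JSON following the layout above; the values are
# copied from the "all" and winogrande blocks of this run.
results_json = """
{
  "all": {"acc": 0.2424611028294509, "acc_norm": 0.2428216424826148},
  "harness|winogrande|5": {"acc": 0.5248618784530387}
}
"""
results = json.loads(results_json)

aggregate_acc = results["all"]["acc"]
per_task = {name: metrics for name, metrics in results.items() if name != "all"}
print(f"aggregate acc: {aggregate_acc:.4f}")  # aggregate acc: 0.2425
```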
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
coastalcph/medical-bios | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-classification
language:
- en
tags:
- medical
pretty_name: medical-bios
size_categories:
- 1K<n<10K
---
# Dataset Description
The dataset comprises English biographies labeled with occupations and binary genders.
This is an occupation classification task, where bias concerning gender can be studied.
It includes a subset of 10,000 biographies (8k train/1k dev/1k test) targeting 5 medical occupations (psychologist, surgeon, nurse, dentist, physician), derived from De-Arteaga et al. (2019).
We collect and release human rationale annotations for a subset of 100 biographies in two different settings: non-contrastive and contrastive.
In the former, the annotators were asked to find the rationale for the question: "Why is the person in the following short bio described as an L?", where L is the gold label occupation, e.g., nurse.
In the latter, the question was "Why is the person in the following short bio described as an L rather than an F?", where F (foil) is another medical occupation, e.g., physician.
You can read more details on the dataset and the annotation process in the paper [Eberle et al. (2023)](https://arxiv.org/abs/2310.11906).
# Dataset Structure
We provide the `standard` version of the dataset, where examples look as follows.
```json
{
"text": "He has been a practicing Dentist for 20 years. He has done BDS. He is currently associated with Sree Sai Dental Clinic in Sowkhya Ayurveda Speciality Clinic, Chennai. ... ",
    "label": 3
}
```
and the newly curated subset of examples including human rationales, dubbed `rationales`, where examples look as follows.
```json
{
    "text": "She is currently practising at Dr Ravindra Ratolikar Dental Clinic in Narayanguda, Hyderabad.",
    "label": 3,
    "foil": 2,
    "words": ["She", "is", "currently", "practising", "at", "Dr", "Ravindra", "Ratolikar", "Dental", "Clinic", "in", "Narayanguda", ",", "Hyderabad", "."],
    "rationales": [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
    "contrastive_rationales": [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
    "annotations": [[0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], ...],
    "contrastive_annotations": [[0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], ...]
}
```
# Use
To load the `standard` version of the dataset:
```python
from datasets import load_dataset
dataset = load_dataset("coastalcph/medical-bios", "standard")
```
To load the newly curated subset of examples with human rationales:
```python
from datasets import load_dataset
dataset = load_dataset("coastalcph/medical-bios", "rationales")
```
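As an illustrative sketch (not part of the official release; it assumes a rationale mask aligned one-to-one with `words`), a binary mask can be mapped back onto the tokenized words to recover the highlighted span:

```python
# Sketch: recover the words an annotator marked as the rationale.
# The record below is abbreviated for illustration; field names follow the card.
example = {
    "words": ["She", "is", "currently", "practising", "at", "Dr", "Ravindra",
              "Ratolikar", "Dental", "Clinic", "in", "Narayanguda", ",",
              "Hyderabad", "."],
    "rationales": [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0],
}

highlighted = [word for word, mask in zip(example["words"], example["rationales"]) if mask]
print(highlighted)  # ['Dental', 'Clinic']
```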
# Citation
[*Oliver Eberle\*, Ilias Chalkidis\*, Laura Cabello, Stephanie Brandl. Rather a Nurse than a Physician - Contrastive Explanations under Investigation. 2023. In the Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing. Singapore.*](https://aclanthology.org/2023.emnlp-main.427)
```
@inproceedings{eberle-etal-2023-rather,
title = "Rather a Nurse than a Physician - Contrastive Explanations under Investigation",
author = "Eberle, Oliver and
Chalkidis, Ilias and
Cabello, Laura and
Brandl, Stephanie",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.emnlp-main.427",
}
```
|
Toygar/turkish-offensive-language-detection | ---
annotations_creators:
- crowdsourced
- expert-generated
language_creators:
- crowdsourced
language:
- tr
license:
- cc-by-2.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets: []
task_categories:
- text-classification
task_ids: []
pretty_name: Turkish Offensive Language Detection Dataset
tags:
- offensive-language-classification
---
# Dataset Summary
This dataset is an enhanced version of existing offensive-language studies. Existing datasets are highly imbalanced, and solving this problem is too costly. To address it, we proposed a contextual data-mining method for dataset augmentation. Our method essentially spares us from retrieving random tweets and labeling them individually: we can directly access almost exclusively hate-related tweets and label them without further human interaction, which solves the imbalanced-label problem.
In addition, existing studies *(which can be found in the References section)* were merged to create an even more comprehensive and robust dataset for the Turkish offensive-language detection task.
The file train.csv contains 42,398 annotated tweets, test.csv contains 8,851, and valid.csv contains 1,756.
# Dataset Structure
A binary dataset with (0) Not Offensive and (1) Offensive tweets.
### Task and Labels
Offensive language identification:
- (0) Not Offensive - Tweet does not contain offense or profanity.
- (1) Offensive - Tweet contains offensive language or a targeted (veiled or direct) offense.
### Data Splits
| | train | test | dev |
|------:|:------|:-----|:-----|
| 0 (Not Offensive) | 22,589 | 4,436 | 1,402 |
| 1 (Offensive) | 19,809 | 4,415 | 354 |
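The imbalance that motivated the augmentation is visible in the table above; as a quick sketch, the totals and offensive ratios per split can be reproduced from those counts:

```python
# Per-split label counts copied from the table above.
splits = {
    "train": {0: 22589, 1: 19809},
    "test": {0: 4436, 1: 4415},
    "dev": {0: 1402, 1: 354},
}

for name, counts in splits.items():
    total = counts[0] + counts[1]
    offensive_ratio = counts[1] / total
    print(f"{name}: {total} tweets, {offensive_ratio:.1%} offensive")
```

The totals (42,398 / 8,851 / 1,756) match the file sizes stated in the summary; note that the dev split remains noticeably imbalanced.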
### Citation Information
```
T. Tanyel, B. Alkurdi and S. Ayvaz, "Linguistic-based Data Augmentation Approach for Offensive Language Detection," 2022 7th International Conference on Computer Science and Engineering (UBMK), 2022, pp. 1-6, doi: 10.1109/UBMK55850.2022.9919562.
```
### Paper codes
https://github.com/tanyelai/lingda
# References
Before applying our method, we merged open-source Turkish offensive-language datasets to increase contextual coverage even further.
- https://huggingface.co/datasets/offenseval2020_tr
- https://github.com/imayda/turkish-hate-speech-dataset-2
- https://www.kaggle.com/datasets/kbulutozler/5k-turkish-tweets-with-incivil-content
|
thorirhrafn/icesum | ---
dataset_info:
features:
- name: Title
dtype: string
- name: Text
dtype: string
- name: Summary
dtype: string
splits:
- name: train
num_bytes: 2317552
num_examples: 900
- name: eval
num_bytes: 129982
num_examples: 50
- name: test
num_bytes: 136397
num_examples: 50
download_size: 1641880
dataset_size: 2583931
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
- split: test
path: data/test-*
---
|
semeru/code-text-php | ---
license: mit
Programminglanguage: php
version: '7 or later'
Date: Codesearchnet(Jun 2020 - paper release date)
Contaminated: Very Likely
Size: Standard Tokenizer (TreeSitter)
---
### Dataset is imported from CodeXGLUE and pre-processed using their script.
# Where to find in Semeru:
The dataset can be found at /nfs/semeru/semeru_datasets/code_xglue/code-to-text/php in Semeru
# CodeXGLUE -- Code-To-Text
## Task Definition
The task is to generate natural language comments for code, evaluated by the [smoothed bleu-4](https://www.aclweb.org/anthology/C04-1072.pdf) score.
## Dataset
The dataset we use comes from [CodeSearchNet](https://arxiv.org/pdf/1909.09436.pdf), filtered as follows:
- Remove examples whose code cannot be parsed into an abstract syntax tree.
- Remove examples whose documents contain fewer than 3 or more than 256 tokens.
- Remove examples whose documents contain special tokens (e.g. <img ...> or https:...).
- Remove examples whose documents are not in English.
### Data Format
After preprocessing the dataset, you obtain three .jsonl files, i.e. train.jsonl, valid.jsonl, test.jsonl
For each file, each line in the uncompressed file represents one function. One row is illustrated below.
- **repo:** the owner/repo
- **path:** the full path to the original file
- **func_name:** the function or method name
- **original_string:** the raw string before tokenization or parsing
- **language:** the programming language
- **code/function:** the part of the `original_string` that is code
- **code_tokens/function_tokens:** tokenized version of `code`
- **docstring:** the top-level comment or docstring, if it exists in the original string
- **docstring_tokens:** tokenized version of `docstring`
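Since each line of the .jsonl files is a standalone JSON object, a row can be parsed directly. A minimal sketch (the record below is invented for illustration; only the field names follow the list above):

```python
import json

# Hypothetical train.jsonl line; the PHP snippet itself is made up.
line = '{"repo": "example/repo", "func_name": "sayHello", "language": "php", "code": "function sayHello() { return 1; }", "docstring": "Say hello."}'

record = json.loads(line)
print(record["func_name"], "->", record["docstring"])  # sayHello -> Say hello.
```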
### Data Statistics
| Programming Language | Training | Dev | Test |
| :------------------- | :------: | :----: | :----: |
| PHP | 241,241 | 12,982 | 14,014 |
## Reference
<pre><code>@article{husain2019codesearchnet,
title={Codesearchnet challenge: Evaluating the state of semantic code search},
author={Husain, Hamel and Wu, Ho-Hsiang and Gazit, Tiferet and Allamanis, Miltiadis and Brockschmidt, Marc},
journal={arXiv preprint arXiv:1909.09436},
year={2019}
}</code></pre>
|
HeshamHaroon/QA_Arabic | ---
language:
- "ar"
pretty_name: "Questions and Answers Dataset in Arabic"
tags:
- "question-answer"
- "language-learning"
- "chatbot"
license: "apache-2.0"
task_categories:
- "question-answering"
- "text-generation"
- "text2text-generation"
---
# JSON File Description
## Overview
This JSON file contains a collection of questions and answers in Arabic. Each question is associated with its corresponding answer. The file is structured in a way that allows easy retrieval and utilization of the question-answer pairs.
## File Structure
The JSON file follows the following structure:
```json
{
  "questions": [
    {
      "question": "من هو أول من نزل على سطح القمر؟",
      "answer": "نيل أرمسترونج"
    },
    {
      "question": "كم عدد الأسنان في فم الإنسان العادي؟",
      "answer": "32 سنا"
    },
    {
      "question": "كم عدد أعين الذبابة؟",
      "answer": "5 أعين"
    },
    {
      "question": "كم عدد أرجل العنكبوت؟",
      "answer": "ج4 - 8 أرجل"
    },
    {
      "question": "س5 - ماذا يسمى بيت النمل؟",
      "answer": "ج5 - قرية النمل"
    },
    {
      "question": "س6 - كم عظمة توجد في جسم الإنسان؟",
      "answer": "ج6 - 206 عظمات"
    },
    ...
  ]
}
```
The file consists of a single object with one key, "questions," which contains an array of question-answer pairs. Each question-answer pair is represented as an object with two keys: "question" and "answer".
## Usage
- Question-Answer Retrieval: Parse the JSON file and access the question-answer pairs programmatically to retrieve specific questions and their corresponding answers.
- Language Learning: Utilize the question-answer pairs to develop language learning applications or quizzes where users can practice answering questions in Arabic.
- Chatbot Integration: Integrate the JSON file with a chatbot system to provide automated responses based on the questions and answers available.
Feel free to modify the JSON file by adding more question-answer pairs or use it as a reference to create your own question-answer datasets.
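A minimal sketch of the retrieval use case (it assumes only the file structure described above; the lookup built here is an illustration, not part of the dataset):

```python
import json

# Build a question -> answer lookup from a JSON document shaped as above.
raw = """
{
  "questions": [
    {"question": "من هو أول من نزل على سطح القمر؟", "answer": "نيل أرمسترونج"},
    {"question": "كم عدد الأسنان في فم الإنسان العادي؟", "answer": "32 سنا"}
  ]
}
"""
data = json.loads(raw)
qa_index = {item["question"]: item["answer"] for item in data["questions"]}
print(qa_index["من هو أول من نزل على سطح القمر؟"])  # نيل أرمسترونج
```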
## Contributing
If you have additional questions and answers that you would like to contribute to this JSON file, please feel free to submit a pull request. Your contributions are greatly appreciated!
|
coastalcph/xlingual_mpararel_autorr | ---
dataset_info:
features:
- name: id
dtype: string
- name: language
dtype: string
- name: relation
dtype: string
- name: template
dtype: string
- name: template_id
dtype: int64
- name: query
dtype: string
- name: sub_uri
dtype: string
- name: obj_uri
dtype: string
- name: obj_label
sequence: string
- name: sub_label
dtype: string
- name: lineid
dtype: int64
splits:
- name: train
num_bytes: 41418096.73819433
num_examples: 203027
download_size: 6943431
dataset_size: 41418096.73819433
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ilist | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- awa
- bho
- bra
- hi
- mag
license:
- cc-by-4.0
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids: []
pretty_name: ilist
tags:
- language-identification
dataset_info:
features:
- name: language_id
dtype:
class_label:
names:
'0': AWA
'1': BRA
'2': MAG
'3': BHO
'4': HIN
- name: text
dtype: string
splits:
- name: train
num_bytes: 14362998
num_examples: 70351
- name: test
num_bytes: 2146857
num_examples: 9692
- name: validation
num_bytes: 2407643
num_examples: 10329
download_size: 18284850
dataset_size: 18917498
---
# Dataset Card for ilist
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** https://github.com/kmi-linguistics/vardial2018
- **Paper:** [Language Identification and Morphosyntactic Tagging: The Second VarDial Evaluation Campaign](https://aclanthology.org/W18-3901/)
- **Leaderboard:**
- **Point of Contact:** linguistics.kmi@gmail.com
### Dataset Summary
This dataset was introduced in a shared task aimed at identifying 5 closely-related languages of the Indo-Aryan language family: Hindi (also known as Khari Boli), Braj Bhasha, Awadhi, Bhojpuri and Magahi. These languages form part of a continuum starting from Western Uttar Pradesh (Hindi and Braj Bhasha) to Eastern Uttar Pradesh (Awadhi and Bhojpuri) and the neighbouring Eastern state of Bihar (Bhojpuri and Magahi).
For this task, participants were provided with a dataset of approximately 15,000 sentences in each language, mainly from the domain of literature, published over the web as well as in print.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Hindi, Braj Bhasha, Awadhi, Bhojpuri and Magahi
## Dataset Structure
### Data Instances
```
{
"language_id": 4,
	"text": 'तभी बारिश हुई थी जिसका गीलापन इन मूर्तियों को इन तस्वीरों में एक अलग रूप देता है .'
}
```
### Data Fields
- `text`: text which you want to classify
- `language_id`: label for the text as an integer from 0 to 4
The language ids correspond to the following languages: "AWA", "BRA", "MAG", "BHO", "HIN".
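A small sketch for mapping the integer labels back to language codes, following the ordering given above:

```python
# Label ordering as stated above: 0=AWA, 1=BRA, 2=MAG, 3=BHO, 4=HIN.
LABEL_NAMES = ["AWA", "BRA", "MAG", "BHO", "HIN"]

def id_to_language(label_id: int) -> str:
    """Return the language code for an ilist label id."""
    return LABEL_NAMES[label_id]

print(id_to_language(4))  # HIN
```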
### Data Splits
| | train | valid | test |
|----------------------|-------|-------|-------|
| # of input sentences | 70351 | 9692 | 10329 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
The data for this task was collected from both printed and digital sources. Printed materials were obtained from different institutions that promote these languages. We also gathered data from libraries, as well as from local literary and cultural groups. We collected printed stories, novels and essays from books, magazines, and newspapers.
#### Initial Data Collection and Normalization
We scanned the printed materials, performed OCR, and finally asked native speakers of the respective languages to correct the OCR output. Since no OCR models specific to these languages are available, we used Google's OCR for Hindi, which is part of the Drive API. Since all the languages use the Devanagari script, we expected the OCR to work reasonably well, and overall it did. We also managed to obtain some blogs in Magahi and Bhojpuri.
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
This work is licensed under a Creative Commons Attribution 4.0 International License: http://creativecommons.org/licenses/by/4.0/
### Citation Information
```
@inproceedings{zampieri-etal-2018-language,
title = "Language Identification and Morphosyntactic Tagging: The Second {V}ar{D}ial Evaluation Campaign",
author = {Zampieri, Marcos and
Malmasi, Shervin and
Nakov, Preslav and
Ali, Ahmed and
Shon, Suwon and
Glass, James and
Scherrer, Yves and
Samard{\v{z}}i{\'c}, Tanja and
Ljube{\v{s}}i{\'c}, Nikola and
Tiedemann, J{\"o}rg and
van der Lee, Chris and
Grondelaers, Stefan and
Oostdijk, Nelleke and
Speelman, Dirk and
van den Bosch, Antal and
Kumar, Ritesh and
Lahiri, Bornini and
Jain, Mayank},
booktitle = "Proceedings of the Fifth Workshop on {NLP} for Similar Languages, Varieties and Dialects ({V}ar{D}ial 2018)",
month = aug,
year = "2018",
address = "Santa Fe, New Mexico, USA",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/W18-3901",
pages = "1--17",
}
```
### Contributions
Thanks to [@vasudevgupta7](https://github.com/vasudevgupta7) for adding this dataset. |
gayanin/babylon-native-v8 | ---
dataset_info:
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 682164
num_examples: 3893
- name: test
num_bytes: 77596
num_examples: 487
- name: validation
num_bytes: 73087
num_examples: 487
download_size: 462975
dataset_size: 832847
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
infinityofspace/python_codestyles-mixed1-500 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: code
dtype: string
- name: code_codestyle
dtype: int64
- name: style_context
dtype: string
- name: style_context_codestyle
dtype: int64
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1794945328.216033
num_examples: 153992
- name: test
num_bytes: 326644128.3197262
num_examples: 28194
download_size: 645473358
dataset_size: 2121589456.5357592
license: mit
tags:
- python
- code-style
- mixed
size_categories:
- 100K<n<1M
---
# Dataset Card for "python_codestyles-mixed1-500"
This dataset contains negative and positive examples of Python code complying with a code style. A positive
example represents compliance with the code style (label 1). Each example is composed of two components: the first
is a piece of code that either conforms to the code style or violates it, and the second is an example code that
already conforms to that code style.
The dataset combines both
datasets [infinityofspace/python_codestyles-random-500](https://huggingface.co/datasets/infinityofspace/python_codestyles-random-500)
and [infinityofspace/python_codestyles-single-500](https://huggingface.co/datasets/infinityofspace/python_codestyles-single-500)
by randomly selecting half of the examples from each of the two datasets.
The code styles in the combined dataset differ in at least one code style rule (examples drawn from the `single`
variant differ in exactly one), which is why this is called a `mixed` code style dataset variant. The dataset
consists of a training and a test group, with none of the code styles overlapping between the groups. In addition,
the two groups contain completely different underlying code.
The examples contain source code from the following repositories:
| repository | tag or commit |
|:-----------------------------------------------------------------------:|:----------------------------------------:|
| [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python) | f614ed72170011d2d439f7901e1c8daa7deac8c4 |
| [huggingface/transformers](https://github.com/huggingface/transformers) | v4.31.0 |
| [huggingface/datasets](https://github.com/huggingface/datasets) | 2.13.1 |
| [huggingface/diffusers](https://github.com/huggingface/diffusers) | v0.18.2 |
| [huggingface/accelerate](https://github.com/huggingface/accelerate) | v0.21.0 | |
pesc101/spyder-ide-lbl-all-4x | ---
dataset_info:
features:
- name: meta_data
struct:
- name: contains_class
dtype: bool
- name: contains_function
dtype: bool
- name: end_line
dtype: int64
- name: file_imports
sequence: string
- name: file_name
dtype: string
- name: module
dtype: string
- name: start_line
dtype: int64
- name: code
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 118052457
num_examples: 31281
download_size: 33479565
dataset_size: 118052457
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lambdalabs/pokemon-blip-captions | ---
license: cc-by-nc-sa-4.0
annotations_creators:
- machine-generated
language:
- en
language_creators:
- other
multilinguality:
- monolingual
pretty_name: 'Pokรฉmon BLIP captions'
size_categories:
- n<1K
source_datasets:
- huggan/few-shot-pokemon
tags: []
task_categories:
- text-to-image
task_ids: []
---
# Notice of DMCA Takedown Action
We have received a DMCA takedown notice from The Pokรฉmon Company International, Inc.
In response to this action, we have taken down the dataset.
We appreciate your understanding. |
danrop/photo.ai | ---
license: openrail
---
|
DZN222/gabe | ---
license: openrail
---
|
Lambent/sonnet_preferences_dpo | ---
license: apache-2.0
---
|
one-sec-cv12/chunk_59 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 24813040320.5
num_examples: 258340
download_size: 21904748473
dataset_size: 24813040320.5
---
# Dataset Card for "chunk_59"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_rizla__rizla55b | ---
pretty_name: Evaluation run of rizla/rizla55b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rizla/rizla55b](https://huggingface.co/rizla/rizla55b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rizla__rizla55b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T09:00:01.266295](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__rizla55b/blob/main/results_2024-02-02T09-00-01.266295.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.629966070509496,\n\
\ \"acc_stderr\": 0.0328646103693209,\n \"acc_norm\": 0.6377234122481039,\n\
\ \"acc_norm_stderr\": 0.03355760561589414,\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.01703883901059167,\n \"mc2\": 0.5559179467355304,\n\
\ \"mc2_stderr\": 0.015414641498233956\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5554607508532423,\n \"acc_stderr\": 0.01452122640562707,\n\
\ \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180628\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5973909579764987,\n\
\ \"acc_stderr\": 0.004894210011303203,\n \"acc_norm\": 0.8042222664807808,\n\
\ \"acc_norm_stderr\": 0.003959872578165267\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361074,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361074\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4656084656084656,\n \"acc_stderr\": 0.025690321762493838,\n \"\
acc_norm\": 0.4656084656084656,\n \"acc_norm_stderr\": 0.025690321762493838\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764826,\n \"\
acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764826\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n \"\
acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306426,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306426\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.02424378399406217,\n \
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.02424378399406217\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.029079374539480007,\n\
\ \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.029079374539480007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849928,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849928\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.01653061740926688,\n \"acc_norm\"\
: 0.818348623853211,\n \"acc_norm_stderr\": 0.01653061740926688\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n\
\ \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n\
\ \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n\
\ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503217,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503217\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922744,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922744\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.789272030651341,\n\
\ \"acc_stderr\": 0.014583812465862545,\n \"acc_norm\": 0.789272030651341,\n\
\ \"acc_norm_stderr\": 0.014583812465862545\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.02500931379006971,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.02500931379006971\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4692737430167598,\n\
\ \"acc_stderr\": 0.016690896161944385,\n \"acc_norm\": 0.4692737430167598,\n\
\ \"acc_norm_stderr\": 0.016690896161944385\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967308,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967308\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5123859191655802,\n\
\ \"acc_stderr\": 0.012766317315473551,\n \"acc_norm\": 0.5123859191655802,\n\
\ \"acc_norm_stderr\": 0.012766317315473551\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254177,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254177\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7009803921568627,\n \"acc_stderr\": 0.018521756215423024,\n \
\ \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.018521756215423024\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.01703883901059167,\n \"mc2\": 0.5559179467355304,\n\
\ \"mc2_stderr\": 0.015414641498233956\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.01147774768422318\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.26838514025777105,\n \
\ \"acc_stderr\": 0.01220570268801367\n }\n}\n```"
repo_url: https://huggingface.co/rizla/rizla55b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|arc:challenge|25_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|gsm8k|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hellaswag|10_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T09-00-01.266295.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T09-00-01.266295.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- '**/details_harness|winogrande|5_2024-02-02T09-00-01.266295.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T09-00-01.266295.parquet'
- config_name: results
data_files:
- split: 2024_02_02T09_00_01.266295
path:
- results_2024-02-02T09-00-01.266295.parquet
- split: latest
path:
- results_2024-02-02T09-00-01.266295.parquet
---
# Dataset Card for Evaluation run of rizla/rizla55b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rizla/rizla55b](https://huggingface.co/rizla/rizla55b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, one per evaluated task.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rizla__rizla55b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-02T09:00:01.266295](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__rizla55b/blob/main/results_2024-02-02T09-00-01.266295.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, each with a "latest" split):
```json
{
"all": {
"acc": 0.629966070509496,
"acc_stderr": 0.0328646103693209,
"acc_norm": 0.6377234122481039,
"acc_norm_stderr": 0.03355760561589414,
"mc1": 0.38555691554467564,
"mc1_stderr": 0.01703883901059167,
"mc2": 0.5559179467355304,
"mc2_stderr": 0.015414641498233956
},
"harness|arc:challenge|25": {
"acc": 0.5554607508532423,
"acc_stderr": 0.01452122640562707,
"acc_norm": 0.6032423208191127,
"acc_norm_stderr": 0.014296513020180628
},
"harness|hellaswag|10": {
"acc": 0.5973909579764987,
"acc_stderr": 0.004894210011303203,
"acc_norm": 0.8042222664807808,
"acc_norm_stderr": 0.003959872578165267
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361074,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361074
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4656084656084656,
"acc_stderr": 0.025690321762493838,
"acc_norm": 0.4656084656084656,
"acc_norm_stderr": 0.025690321762493838
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764826,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306426,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.02424378399406217,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.02424378399406217
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849928,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849928
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.01653061740926688,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.01653061740926688
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503217,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503217
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922744,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922744
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.789272030651341,
"acc_stderr": 0.014583812465862545,
"acc_norm": 0.789272030651341,
"acc_norm_stderr": 0.014583812465862545
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.02500931379006971,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.02500931379006971
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4692737430167598,
"acc_stderr": 0.016690896161944385,
"acc_norm": 0.4692737430167598,
"acc_norm_stderr": 0.016690896161944385
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967308,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967308
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5123859191655802,
"acc_stderr": 0.012766317315473551,
"acc_norm": 0.5123859191655802,
"acc_norm_stderr": 0.012766317315473551
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254177,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254177
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.018521756215423024,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.018521756215423024
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38555691554467564,
"mc1_stderr": 0.01703883901059167,
"mc2": 0.5559179467355304,
"mc2_stderr": 0.015414641498233956
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.01147774768422318
},
"harness|gsm8k|5": {
"acc": 0.26838514025777105,
"acc_stderr": 0.01220570268801367
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
edwinjue/311-data-2017 | ---
license: gpl-3.0
---
|
joey234/mmlu-public_relations-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 7856
num_examples: 5
- name: test
num_bytes: 1015695
num_examples: 110
download_size: 152938
dataset_size: 1023551
---
# Dataset Card for "mmlu-public_relations-neg-prepend-verbal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ashg59/invoices-images | ---
license: apache-2.0
---
|
shuyuej/gsm8k_testing_promptcraft_generated | ---
license: apache-2.0
---
# Dataset Construction
The paraphrased questions were generated with the [PromptCraft](https://github.com/SuperBruceJia/promptcraft) toolkit.
# Dataset Usage
```python
from datasets import load_dataset
# Load dataset
dataset = load_dataset("shuyuej/gsm8k_testing_promptcraft_generated")
dataset = dataset["test"]
print(dataset)
```
# Citation
If you find our toolkit useful, please consider citing it in your publications; BibTeX entries are provided below.
```bibtex
@misc{JiaPromptCraft23,
author = {Jia, Shuyue},
title = {{PromptCraft}: A Prompt Perturbation Toolkit},
year = {2023},
publisher = {GitHub},
journal = {GitHub Repository},
howpublished = {\url{https://github.com/SuperBruceJia/promptcraft}},
}
@misc{JiaAwesomeLLM23,
author = {Jia, Shuyue},
title = {Awesome {LLM} Self-Consistency},
year = {2023},
publisher = {GitHub},
journal = {GitHub Repository},
howpublished = {\url{https://github.com/SuperBruceJia/Awesome-LLM-Self-Consistency}},
}
@misc{JiaAwesomeSTS23,
author = {Jia, Shuyue},
title = {Awesome Semantic Textual Similarity},
year = {2023},
publisher = {GitHub},
journal = {GitHub Repository},
howpublished = {\url{https://github.com/SuperBruceJia/Awesome-Semantic-Textual-Similarity}},
}
``` |
Allenbv/Jojos-bizarre-diffusiondataset | ---
license: creativeml-openrail-m
---
|
andy-fang/andy_portraits | ---
license: mit
---
|
Santhosh-kumar/hi | ---
license: mit
---
|
Rewcifer/ct_scans_90pct_2000_cutoff_llama | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 821780150.4561067
num_examples: 164551
download_size: 148829636
dataset_size: 821780150.4561067
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ct_scans_90pct_2000_cutoff_llama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
theophilusijiebor1/chest-xray | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': NORMAL
'1': PNEUMONIA
splits:
- name: train
num_bytes: 3186635036.504
num_examples: 5216
- name: validation
num_bytes: 3030633.0
num_examples: 16
- name: test
num_bytes: 79062317.0
num_examples: 624
download_size: 1230487171
dataset_size: 3268727986.504
---
# Dataset Card for "chest-xray"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FreedomIntelligence/ALLaVA-4V-Chinese | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- zh
tags:
- GPT-4V
- LVLM
- Vision
- Language
size_categories:
- 1M<n<10M
configs:
- config_name: allava_laion
data_files:
- split: caption
path: "allava_laion/ALLaVA-Caption-LAION-4V_Chinese.json"
- split: instruct
path: "allava_laion/ALLaVA-Instruct-LAION-4V_Chinese.json"
- config_name: allava_vflan
data_files:
- split: caption
path: "allava_vflan/ALLaVA-Caption-VFLAN-4V_Chinese.json"
- split: instruct
path: "allava_vflan/ALLaVA-Instruct-VFLAN-4V_Chinese.json"
# - config_name: allava_laion_instruction
# data_files: "allava_laion/ALLaVA-Instruct-LAION-4V.json"
# configs:
# - config_name: default
# data_files:
# - split: allava_laion_caption
# path: "allava_laion/ALLaVA-Caption-LAION-4V.json"
# - split: allava_laion_instruction
# path: "allava_laion/ALLaVA-Instruction-LAION-4V.json"
# configs:
# - config_name: default
# - data_files:
# - split: allava_laion_caption
# - path:
# - "allava_laion/ALLaVA-Caption-LAION-4V.json"
# - split: allava_laion_instruction
# - path:
# - "allava_laion/ALLaVA-Instruction-LAION-4V.json"
---
## ALLaVA-4V for Chinese
This is the Chinese version of the ALLaVA-4V data. We translated the ALLaVA-4V data into Chinese with ChatGPT, instructing it not to translate OCR-related content.
The original dataset can be found [here](https://huggingface.co/datasets/FreedomIntelligence/ALLaVA-4V), and the image data can be downloaded from [ALLaVA-4V](https://huggingface.co/datasets/FreedomIntelligence/ALLaVA-4V).
#### Citation
If you find our data useful, please consider citing our work! We are FreedomIntelligence from Shenzhen Research Institute of Big Data and The Chinese University of Hong Kong, Shenzhen.
```
@misc{chen2024allava,
title={ALLaVA: Harnessing GPT4V-synthesized Data for A Lite Vision-Language Model},
author={Guiming Hardy Chen and Shunian Chen and Ruifei Zhang and Junying Chen and Xiangbo Wu and Zhiyi Zhang and Zhihong Chen and Jianquan Li and Xiang Wan and Benyou Wang},
year={2024},
eprint={2402.11684},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-25000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 997171
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/b1b692c4 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1340
dataset_size: 180
---
# Dataset Card for "b1b692c4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/fbd217fa | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1339
dataset_size: 178
---
# Dataset Card for "fbd217fa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ErikCikalleshi/new_york_times_news_2000_2007 | ---
dataset_info:
features:
- name: date
dtype: int64
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 2546303272.221892
num_examples: 497249
- name: test
num_bytes: 282923154.7781082
num_examples: 55250
download_size: 1592417345
dataset_size: 2829226427.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
yangwang825/sst2-pwws-3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: augment
dtype: string
splits:
- name: train
num_bytes: 3455887
num_examples: 27603
- name: validation
num_bytes: 110096
num_examples: 872
- name: test
num_bytes: 226340
num_examples: 1821
download_size: 1303865
dataset_size: 3792323
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
hackathon-somos-nlp-2023/podcasts-ner-es | ---
dataset_info:
features:
- name: text
dtype: string
- name: annotation
list:
- name: end
dtype: int64
- name: label
dtype: string
- name: start
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 43389.8358778626
num_examples: 209
- name: test
num_bytes: 11003.164122137405
num_examples: 53
download_size: 42448
dataset_size: 54393
task_categories:
- token-classification
language:
- es
size_categories:
- n<1K
license: mit
---
# Dataset Card for "podcasts-ner-es"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
- [Team members](#team-members)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset comprises small text snippets extracted from the "Deforme Semanal" podcast,
each accompanied by annotations identifying occurrences of a predetermined set of entities.
The purpose of this dataset is to facilitate Named Entity Recognition (NER) tasks.
The dataset was created to aid in identifying entities such as famous people, books, or films in podcasts.
The audio was first transcribed, then annotated with GPT-3 and curated with Argilla.
The dataset is in Spanish and mostly covers love, feminism, and art, the main themes of the podcast.
### Supported Tasks and Leaderboards
Named Entity Recognition
### Languages
The dataset is in Spanish and the language used is primarily informal.
It is important to note that the language may include aggressive or offensive content.
## Dataset Structure
### Data Instances
```
{
 "text":"Tengo 39 años, pues, ya veré cuándo yo quiero dejar de comer ternera, está mal, porque hay sobre explotación y todo esto, muy mal."
"annotation": [ { "end": 13, "label": "DATES", "start": 6 } ]
"id": "53c4748e-dbd2-4cf5-946f-d134b0bf6155"
}
```
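The `start`/`end` values index directly into `text` using Python's end-exclusive slicing convention; for example:

```python
# Recovering the annotated entity span from the instance above
text = ("Tengo 39 años, pues, ya veré cuándo yo quiero dejar de comer "
        "ternera, está mal, porque hay sobre explotación y todo esto, muy mal.")

entity = text[6:13]  # start=6, end=13, label=DATES
print(entity)        # 39 años
```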
### Data Fields
`text`: Snippet of text of no more than 512 characters extracted from a podcast episode.
`id`: Unique identification number for each instance in the dataset.
`annotation`: List of dictionary-like objects with the following fields:
- `end`: end character of the entity occurrence in the text.
- `start`: start character of the entity occurrence in the text.
- `label`: label for the entity, drawn from the predefined set:
'people', 'products', 'books', 'animals', 'organizations', 'topics', 'dates', 'places', 'artista', 'objects', 'songs', and 'films'.
### Data Splits
The dataset was shuffled and split using the `train_test_split` function from the Hugging Face datasets library.
The split was made with a train size of 0.8 and a seed of 42.
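The split can be illustrated with a plain shuffle-and-cut sketch. The snippet below uses the standard library rather than `datasets.train_test_split` (so the exact example membership differs), but the same parameters reproduce the 209/53 split sizes:

```python
import random

# 262 annotated snippets in total (209 train + 53 test, as in the card)
snippets = [{"id": str(i), "text": f"snippet {i}"} for i in range(262)]

rng = random.Random(42)            # seed reported in the card
shuffled = snippets[:]
rng.shuffle(shuffled)              # shuffle order differs from datasets' internal RNG

cut = int(0.8 * len(shuffled))     # train_size=0.8
train, test = shuffled[:cut], shuffled[cut:]
print(len(train), len(test))       # 209 53
```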
## Dataset Creation
### Curation Rationale
We created this dataset with the aim of making the information from our favorite podcasts more accessible, as retrieving information from audio formats can be challenging.
We chose to focus on the Named Entity Recognition (NER) task as it was relatively easy to annotate and validate.
### Source Data
#### Initial Data Collection and Normalization
We collected the data from a playlist on YouTube containing approximately 15 episodes of the "Deforme Semanal" podcast.
You can find the playlist at this [link](https://www.youtube.com/playlist?list=PLLbN7SMQhMVZoXhtQ00AyebQE_-ttDrs9).
We then transcribed the audio stream using OpenAI's Whisper (medium size) and split the resulting text files
into chunks of less than 512 characters.
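The exact chunking procedure is not documented; a minimal sketch of word-preserving chunking under the 512-character limit might look like this (the function name and the word-boundary strategy are assumptions, not the authors' implementation):

```python
def chunk_transcript(text: str, max_chars: int = 512) -> list[str]:
    """Greedily pack whole words into chunks of at most max_chars characters."""
    chunks, current = [], ""
    for word in text.split():
        candidate = f"{current} {word}".strip()
        if len(candidate) <= max_chars:
            current = candidate
        else:
            chunks.append(current)
            current = word
    if current:
        chunks.append(current)
    return chunks

chunks = chunk_transcript("palabra " * 300)  # ~2400 characters of input
print(all(len(c) <= 512 for c in chunks))    # True
```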
### Annotations
#### Annotation process
To annotate the texts, we used OpenAI's API and GPT-3, with the following prompt:
```
Perform named entity recognition in Spanish. The classes are books, films, video games, songs, places, dates, topics, organizations, and people. The output should follow the format:
[{'class': 'people', 'text': 'name of the person'}, {'class': 'books', 'start': 'name of the book'}]
Sentence:
```
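GPT-3 returns entity mentions as text, while the dataset stores character offsets, so the model output has to be aligned back to each snippet. The authors' post-processing is not documented; a sketch using first-occurrence matching (function name assumed) could be:

```python
def to_offsets(text: str, entities: list[dict]) -> list[dict]:
    """Map GPT-style entity mentions back to start/end character offsets."""
    annotations = []
    for ent in entities:
        start = text.find(ent["text"])
        if start != -1:  # skip mentions not found verbatim in the snippet
            annotations.append({"start": start,
                                "end": start + len(ent["text"]),
                                "label": ent["class"].upper()})
    return annotations

sample = "Tengo 39 años, pues, ya veré cuándo yo quiero dejar de comer ternera."
print(to_offsets(sample, [{"class": "dates", "text": "39 años"}]))
# [{'start': 6, 'end': 13, 'label': 'DATES'}]
```

Mentions that GPT-3 paraphrased rather than copied verbatim would be dropped by this strategy, which is one reason the manual Argilla validation pass matters.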
Finally, to ensure the quality of the dataset, we validated the annotations using Argilla by checking that the tokens were classified
correctly.
## Considerations for Using the Data
### Discussion of Biases
The dataset was obtained from the "Deforme Semanal" podcast, which primarily focuses on art, feminism, and culture.
As a result, the data is directly related to the topics and individuals discussed in these contexts. Additionally,
the language used in the podcast is informal and can be aggressive or offensive at times, which may be reflected in the dataset.
Although we attempted to minimize these biases during the validation process, the effectiveness of those efforts is likely limited.
### Other Known Limitations
One issue we have encountered with the token/entity data is that the entity classes are not always clearly delineated.
In some cases it is unclear how to differentiate between two tokens or entities, which can impact the accuracy
and effectiveness of models trained on this data.
Furthermore, the dataset size is relatively small, which can pose a challenge when it comes to training machine learning models.
With a limited amount of data, it can be difficult to capture the full range of variations and patterns in the data,
and overfitting can become a concern. This is especially true when dealing with complex models that require a large
amount of data to train effectively.
## Team members
[David Mora](https://huggingface.co/DavidFM43)
[Sergio Perez](https://huggingface.co/sergiopperez)
[Alberto Fernandez](https://huggingface.co/AlbertoFH98)
|
apollo-research/monology-pile-uncopyrighted-tokenizer-gpt2 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 196198407000.0
num_examples: 47853270
download_size: 83576335505
dataset_size: 196198407000.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fairlabs/custom-ner-data | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 38627882
num_examples: 36240
- name: validation
num_bytes: 9693972
num_examples: 9060
- name: test
num_bytes: 1322376
num_examples: 1000
download_size: 5410959
dataset_size: 49644230
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_DatPySci__pythia-1b-spin-iter1 | ---
pretty_name: Evaluation run of DatPySci/pythia-1b-spin-iter1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DatPySci/pythia-1b-spin-iter1](https://huggingface.co/DatPySci/pythia-1b-spin-iter1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DatPySci__pythia-1b-spin-iter1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T07:42:33.395938](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-spin-iter1/blob/main/results_2024-02-18T07-42-33.395938.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24877589085779003,\n\
\ \"acc_stderr\": 0.030472375374358776,\n \"acc_norm\": 0.24988280928761106,\n\
\ \"acc_norm_stderr\": 0.031204850855553568,\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474203,\n \"mc2\": 0.3689337478412581,\n\
\ \"mc2_stderr\": 0.014347571303045535\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.27986348122866894,\n \"acc_stderr\": 0.01311904089772592,\n\
\ \"acc_norm\": 0.3054607508532423,\n \"acc_norm_stderr\": 0.013460080478002505\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3910575582553276,\n\
\ \"acc_stderr\": 0.004869899297734552,\n \"acc_norm\": 0.49263095000995816,\n\
\ \"acc_norm_stderr\": 0.004989239462835215\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322716,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322716\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.15789473684210525,\n \"acc_stderr\": 0.029674167520101442,\n\
\ \"acc_norm\": 0.15789473684210525,\n \"acc_norm_stderr\": 0.029674167520101442\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.033450369167889925,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.033450369167889925\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.02834696377716246,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.02834696377716246\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2019704433497537,\n \"acc_stderr\": 0.028247350122180277,\n\
\ \"acc_norm\": 0.2019704433497537,\n \"acc_norm_stderr\": 0.028247350122180277\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.03161877917935411,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.03161877917935411\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148522,\n\
\ \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148522\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.02684151432295895,\n \
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.02684151432295895\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567977,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567977\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3100917431192661,\n \"acc_stderr\": 0.019830849684439756,\n \"\
acc_norm\": 0.3100917431192661,\n \"acc_norm_stderr\": 0.019830849684439756\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25462962962962965,\n \"acc_stderr\": 0.029711275860005368,\n \"\
acc_norm\": 0.25462962962962965,\n \"acc_norm_stderr\": 0.029711275860005368\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350194,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350194\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24472573839662448,\n \"acc_stderr\": 0.027985699387036416,\n \
\ \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.027985699387036416\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.33183856502242154,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.33183856502242154,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.035477710041594626,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.035477710041594626\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.029343114798094472,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.029343114798094472\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261441,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261441\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.02512263760881665,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.02512263760881665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729903,\n \
\ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729903\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3492647058823529,\n \"acc_stderr\": 0.028959755196824855,\n\
\ \"acc_norm\": 0.3492647058823529,\n \"acc_norm_stderr\": 0.028959755196824855\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528037,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528037\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072773,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072773\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.025000256039546212,\n\
\ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.025000256039546212\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.03484331592680587,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.03484331592680587\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.19883040935672514,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.19883040935672514,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474203,\n \"mc2\": 0.3689337478412581,\n\
\ \"mc2_stderr\": 0.014347571303045535\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5359116022099447,\n \"acc_stderr\": 0.014016193433958312\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02350265352539803,\n \
\ \"acc_stderr\": 0.004172883669643974\n }\n}\n```"
repo_url: https://huggingface.co/DatPySci/pythia-1b-spin-iter1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|arc:challenge|25_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|gsm8k|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hellaswag|10_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T07-42-33.395938.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T07-42-33.395938.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- '**/details_harness|winogrande|5_2024-02-18T07-42-33.395938.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T07-42-33.395938.parquet'
- config_name: results
data_files:
- split: 2024_02_18T07_42_33.395938
path:
- results_2024-02-18T07-42-33.395938.parquet
- split: latest
path:
- results_2024-02-18T07-42-33.395938.parquet
---
# Dataset Card for Evaluation run of DatPySci/pythia-1b-spin-iter1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DatPySci/pythia-1b-spin-iter1](https://huggingface.co/DatPySci/pythia-1b-spin-iter1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
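The timestamped split names follow directly from the run timestamp; a minimal sketch of the mapping (an assumption inferred from the split names listed in this card, not an official API):

```python
# Derive the split name used in this card from a run timestamp.
# Assumption: "-" and ":" in the ISO timestamp are replaced by "_"
# (inferred from split names such as 2024_02_18T07_42_33.395938 above).
run_timestamp = "2024-02-18T07:42:33.395938"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_02_18T07_42_33.395938
```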
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DatPySci__pythia-1b-spin-iter1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-18T07:42:33.395938](https://huggingface.co/datasets/open-llm-leaderboard/details_DatPySci__pythia-1b-spin-iter1/blob/main/results_2024-02-18T07-42-33.395938.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24877589085779003,
"acc_stderr": 0.030472375374358776,
"acc_norm": 0.24988280928761106,
"acc_norm_stderr": 0.031204850855553568,
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474203,
"mc2": 0.3689337478412581,
"mc2_stderr": 0.014347571303045535
},
"harness|arc:challenge|25": {
"acc": 0.27986348122866894,
"acc_stderr": 0.01311904089772592,
"acc_norm": 0.3054607508532423,
"acc_norm_stderr": 0.013460080478002505
},
"harness|hellaswag|10": {
"acc": 0.3910575582553276,
"acc_stderr": 0.004869899297734552,
"acc_norm": 0.49263095000995816,
"acc_norm_stderr": 0.004989239462835215
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322716,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322716
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.15789473684210525,
"acc_stderr": 0.029674167520101442,
"acc_norm": 0.15789473684210525,
"acc_norm_stderr": 0.029674167520101442
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889925,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889925
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.02834696377716246,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.02834696377716246
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0383515395439942,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0383515395439942
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2019704433497537,
"acc_stderr": 0.028247350122180277,
"acc_norm": 0.2019704433497537,
"acc_norm_stderr": 0.028247350122180277
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.03161877917935411,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.03161877917935411
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148522,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148522
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881563,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881563
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.02684151432295895,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.02684151432295895
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567977,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567977
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3100917431192661,
"acc_stderr": 0.019830849684439756,
"acc_norm": 0.3100917431192661,
"acc_norm_stderr": 0.019830849684439756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25462962962962965,
"acc_stderr": 0.029711275860005368,
"acc_norm": 0.25462962962962965,
"acc_norm_stderr": 0.029711275860005368
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350194,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350194
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24472573839662448,
"acc_stderr": 0.027985699387036416,
"acc_norm": 0.24472573839662448,
"acc_norm_stderr": 0.027985699387036416
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.33183856502242154,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.33183856502242154,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.035477710041594626,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.035477710041594626
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.029343114798094472,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.029343114798094472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261441,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261441
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.02512263760881665,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.02512263760881665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729903,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729903
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3492647058823529,
"acc_stderr": 0.028959755196824855,
"acc_norm": 0.3492647058823529,
"acc_norm_stderr": 0.028959755196824855
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528037,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528037
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072773,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072773
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.025000256039546212,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.025000256039546212
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.03484331592680587,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.03484331592680587
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.19883040935672514,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.19883040935672514,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474203,
"mc2": 0.3689337478412581,
"mc2_stderr": 0.014347571303045535
},
"harness|winogrande|5": {
"acc": 0.5359116022099447,
"acc_stderr": 0.014016193433958312
},
"harness|gsm8k|5": {
"acc": 0.02350265352539803,
"acc_stderr": 0.004172883669643974
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
0-hero/Matter-0.1-Slim-B | ---
license: apache-2.0
---
Subset B of [Matter-0.1](https://huggingface.co/datasets/0-hero/Matter-0.1) <br>
Datasets have been deduped and decontaminated with the [bagel script from Jon Durbin](https://github.com/jondurbin/bagel/blob/main/bagel/data_sources/__init__.py) |
yuhsinchan/nmsqa_seg-dev_test | ---
dataset_info:
features:
- name: case_id
dtype: string
- name: context_code
sequence: int16
- name: context_cnt
sequence: int16
- name: question_code
sequence: int16
- name: question_cnt
sequence: int16
- name: start_idx
dtype: int64
- name: end_idx
dtype: int64
- name: start_time
dtype: float64
- name: end_time
dtype: float64
splits:
- name: dev
num_bytes: 32879888
num_examples: 17155
- name: test
num_bytes: 455624
num_examples: 267
download_size: 9191201
dataset_size: 33335512
---
# Dataset Card for "nmsqa_seg-dev_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibivibiv/alpaca_lamini5 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 56132706
num_examples: 129280
download_size: 36239071
dataset_size: 56132706
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PolyAI/minds14 | ---
annotations_creators:
- expert-generated
- crowdsourced
- machine-generated
language_creators:
- crowdsourced
- expert-generated
language:
- en
- fr
- it
- es
- pt
- de
- nl
- ru
- pl
- cs
- ko
- zh
language_bcp47:
- en
- en-GB
- en-US
- en-AU
- fr
- it
- es
- pt
- de
- nl
- ru
- pl
- cs
- ko
- zh
license:
- cc-by-4.0
multilinguality:
- multilingual
pretty_name: 'MInDS-14'
size_categories:
- 10K<n<100K
task_categories:
- automatic-speech-recognition
- speech-processing
task_ids:
- speech-recognition
- keyword-spotting
---
# MInDS-14
## Dataset Description
- **Fine-Tuning script:** [pytorch/audio-classification](https://github.com/huggingface/transformers/tree/main/examples/pytorch/audio-classification)
- **Paper:** [Multilingual and Cross-Lingual Intent Detection from Spoken Data](https://arxiv.org/abs/2104.08524)
- **Total amount of disk used:** ca. 500 MB
MInDS-14 is a training and evaluation resource for the intent detection task with spoken data. It covers 14
intents extracted from a commercial system in the e-banking domain, associated with spoken examples in 14 diverse language varieties.
## Example
MInDS-14 can be downloaded and used as follows:
```py
from datasets import load_dataset
minds_14 = load_dataset("PolyAI/minds14", "fr-FR") # for French
# to download all data for multi-lingual fine-tuning uncomment following line
# minds_14 = load_dataset("PolyAI/all", "all")
# see structure
print(minds_14)
# load audio sample on the fly
audio_input = minds_14["train"][0]["audio"] # first decoded audio sample
intent_class = minds_14["train"][0]["intent_class"] # first transcription
intent = minds_14["train"].features["intent_class"].names[intent_class]
# use audio_input and language_class to fine-tune your model for audio classification
```
## Dataset Structure
We show detailed information for the example configuration `fr-FR` of the dataset.
All other configurations have the same structure.
### Data Instances
**fr-FR**
- Size of downloaded dataset files: 471 MB
- Size of the generated dataset: 300 KB
- Total amount of disk used: 471 MB
An example of a data instance of the config `fr-FR` looks as follows:
```
{
"path": "/home/patrick/.cache/huggingface/datasets/downloads/extracted/3ebe2265b2f102203be5e64fa8e533e0c6742e72268772c8ac1834c5a1a921e3/fr-FR~ADDRESS/response_4.wav",
"audio": {
"path": "/home/patrick/.cache/huggingface/datasets/downloads/extracted/3ebe2265b2f102203be5e64fa8e533e0c6742e72268772c8ac1834c5a1a921e3/fr-FR~ADDRESS/response_4.wav",
"array": array(
[0.0, 0.0, 0.0, ..., 0.0, 0.00048828, -0.00024414], dtype=float32
),
"sampling_rate": 8000,
},
"transcription": "je souhaite changer mon adresse",
"english_transcription": "I want to change my address",
"intent_class": 1,
"lang_id": 6,
}
```
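Given the `audio` dict above, the clip duration follows from the array length divided by the sampling rate; a minimal self-contained sketch (the zero-filled array is a stand-in, not real MInDS-14 audio):

```python
# Stand-in for the decoded `audio` field shown above (not real data).
audio = {
    "array": [0.0] * 16000,  # 16,000 samples
    "sampling_rate": 8000,   # MInDS-14 audio is sampled at 8 kHz
}

# Duration in seconds = number of samples / samples per second.
duration_s = len(audio["array"]) / audio["sampling_rate"]
print(duration_s)  # 2.0
```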
### Data Fields
The data fields are the same among all splits.
- **path** (str): Path to the audio file
- **audio** (dict): Audio object including loaded audio array, sampling rate and path to the audio file
- **transcription** (str): Transcription of the audio file
- **english_transcription** (str): English transcription of the audio file
- **intent_class** (int): Class id of intent
- **lang_id** (int): Id of language
### Data Splits
Every config only has the `"train"` split, containing *ca.* 600 examples.
## Dataset Creation
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
All datasets are licensed under the [Creative Commons license (CC-BY)](https://creativecommons.org/licenses/).
### Citation Information
```
@article{DBLP:journals/corr/abs-2104-08524,
author = {Daniela Gerz and
Pei{-}Hao Su and
Razvan Kusztos and
Avishek Mondal and
Michal Lis and
Eshan Singhal and
Nikola Mrksic and
Tsung{-}Hsien Wen and
Ivan Vulic},
title = {Multilingual and Cross-Lingual Intent Detection from Spoken Data},
journal = {CoRR},
volume = {abs/2104.08524},
year = {2021},
url = {https://arxiv.org/abs/2104.08524},
eprinttype = {arXiv},
eprint = {2104.08524},
timestamp = {Mon, 26 Apr 2021 17:25:10 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2104-08524.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset
|
irds/codec_history | ---
pretty_name: '`codec/history`'
viewer: false
source_datasets: ['irds/codec']
task_categories:
- text-retrieval
---
# Dataset Card for `codec/history`
The `codec/history` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/codec#codec/history).
# Data
This dataset provides:
- `queries` (i.e., topics); count=14
- `qrels`: (relevance assessments); count=2,024
- For `docs`, use [`irds/codec`](https://huggingface.co/datasets/irds/codec)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/codec_history', 'queries')
for record in queries:
record # {'query_id': ..., 'query': ..., 'domain': ..., 'guidelines': ...}
qrels = load_dataset('irds/codec_history', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{mackie2022codec,
title={CODEC: Complex Document and Entity Collection},
author={Mackie, Iain and Owoicho, Paul and Gemmell, Carlos and Fischer, Sophie and MacAvaney, Sean and Dalton, Jeffery},
booktitle={Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval},
year={2022}
}
```
|
Nexdata/Chinese_Digital_Speech_Data_by_Mobile_Phone | ---
---
# Dataset Card for Nexdata/Chinese_Digital_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1072?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
11,010 People - Chinese Digital Speech Data by Mobile Phone was recorded by 11,010 speakers in Mandarin. 30 sentences of 4-8 digits were collected from each speaker.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1072?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Chinese
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commerical License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
stingning/ultrachat | ---
license: mit
task_categories:
- conversational
- text-generation
language:
- en
size_categories:
- 1M<n<10M
pretty_name: UltraChat
---
# Dataset Card for Dataset Name
## Dataset Description
An open-source, large-scale, multi-round dialogue dataset powered by Turbo APIs. In consideration of factors such as safeguarding privacy, **we do not directly use any data available on the Internet as prompts**.
To ensure generation quality, two separate ChatGPT Turbo APIs are adopted in generation, where one plays the role of the user to generate queries and the other generates the response.
We instruct the user model with carefully designed prompts to mimic human user behavior and call the two APIs iteratively. The generated dialogues undergo further post-processing and filtering.
UltraChat is composed of three sectors:
- **Questions about the World**: The dialogue data in this sector is derived from a wide range of inquiries related to concepts, entities, and objects from the real world. The topics covered are extensive, spanning areas such as technology, art, and entrepreneurship.
- **Writing and Creation**: The dialogue data in this sector is driven by the demands for writing/creation from scratch, and encompasses any tasks that an AI assistant may aid within the creative process, spanning from email composition to crafting narratives and plays, and beyond.
- **Assistance on Existent Materials**: The dialogue data in this sector is generated based on existing materials, including but not limited to rewriting, continuation, summarization, and inference, covering a diverse range of topics.
- Repository: [UltraChat](https://github.com/thunlp/UltraChat)
- Explorer: [plain-explorer](http://39.101.77.220/), [Nomic-AI-Atlas-Explorer](https://atlas.nomic.ai/map/0ce65783-c3a9-40b5-895d-384933f50081/a7b46301-022f-45d8-bbf4-98107eabdbac)
## Dataset Structure
Each line in the downloaded data file is a json dict containing the data id and dialogue data in a list format. Below is an example line.
```
{
"id": "0",
"data": [
"How can cross training benefit groups like runners, swimmers, or weightlifters?",
"Cross training can benefit groups like runners, swimmers, or weightlifters in the following ways: ...",
"That makes sense. I've been wanting to improve my running time, but I never thought about incorporating strength training. Do you have any recommendations for specific exercises?",
"Sure, here are some strength training exercises that can benefit runners: ...",
"Hmm, I'm not really a fan of weightlifting though. Can I incorporate other forms of exercise into my routine to improve my running time?",
"Yes, absolutely! ...",
"..."
]
}
```
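Each line can be split into (user, assistant) turn pairs with a few lines of standard-library code; a minimal sketch, assuming (as in the example above) that even-indexed entries in `data` are user turns and odd-indexed entries are assistant responses:

```python
import json

def parse_dialogue(line: str):
    """Parse one UltraChat JSONL line into its id and (user, assistant) pairs."""
    record = json.loads(line)
    turns = record["data"]
    # Pair user queries (even indices) with assistant responses (odd indices).
    return record["id"], list(zip(turns[0::2], turns[1::2]))

line = json.dumps({
    "id": "0",
    "data": [
        "How can cross training benefit runners?",
        "Cross training can benefit runners by ...",
        "Do you have any recommendations for specific exercises?",
        "Sure, here are some strength training exercises ...",
    ],
})
dialogue_id, pairs = parse_dialogue(line)
print(dialogue_id, len(pairs))  # 0 2
```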
### Citation Information
```bibtex
@article{ding2023enhancing,
title={Enhancing Chat Language Models by Scaling High-quality Instructional Conversations},
author={Ding, Ning and Chen, Yulin and Xu, Bokai and Qin, Yujia and Zheng, Zhi and Hu, Shengding and Liu, Zhiyuan and Sun, Maosong and Zhou, Bowen},
journal={arXiv preprint arXiv:2305.14233},
year={2023}
}
``` |
severo/glue | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- acceptability-classification
- natural-language-inference
- semantic-similarity-scoring
- sentiment-classification
- text-scoring
paperswithcode_id: glue
pretty_name: GLUE (General Language Understanding Evaluation benchmark)
train-eval-index:
- config: cola
task: text-classification
task_id: binary_classification
splits:
train_split: train
eval_split: validation
col_mapping:
sentence: text
label: target
- config: sst2
task: text-classification
task_id: binary_classification
splits:
train_split: train
eval_split: validation
col_mapping:
sentence: text
label: target
- config: mrpc
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
- config: qqp
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
question1: text1
question2: text2
label: target
- config: stsb
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
- config: mnli
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation_matched
col_mapping:
premise: text1
hypothesis: text2
label: target
- config: mnli_mismatched
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
premise: text1
hypothesis: text2
label: target
- config: mnli_matched
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
premise: text1
hypothesis: text2
label: target
- config: qnli
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
question: text1
sentence: text2
label: target
- config: rte
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
- config: wnli
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
configs:
- ax
- cola
- mnli
- mnli_matched
- mnli_mismatched
- mrpc
- qnli
- qqp
- rte
- sst2
- stsb
- wnli
tags:
- qa-nli
- coreference-nli
- paraphrase-identification
---
# Dataset Card for GLUE
## Table of Contents
- [Dataset Card for GLUE](#dataset-card-for-glue)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [ax](#ax)
- [cola](#cola)
- [mnli](#mnli)
- [mnli_matched](#mnli_matched)
- [mnli_mismatched](#mnli_mismatched)
- [mrpc](#mrpc)
- [qnli](#qnli)
- [qqp](#qqp)
- [rte](#rte)
- [sst2](#sst2)
- [stsb](#stsb)
- [wnli](#wnli)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [ax](#ax-1)
- [cola](#cola-1)
- [mnli](#mnli-1)
- [mnli_matched](#mnli_matched-1)
- [mnli_mismatched](#mnli_mismatched-1)
- [mrpc](#mrpc-1)
- [qnli](#qnli-1)
- [qqp](#qqp-1)
- [rte](#rte-1)
- [sst2](#sst2-1)
- [stsb](#stsb-1)
- [wnli](#wnli-1)
- [Data Fields](#data-fields)
- [ax](#ax-2)
- [cola](#cola-2)
- [mnli](#mnli-2)
- [mnli_matched](#mnli_matched-2)
- [mnli_mismatched](#mnli_mismatched-2)
- [mrpc](#mrpc-2)
- [qnli](#qnli-2)
- [qqp](#qqp-2)
- [rte](#rte-2)
- [sst2](#sst2-2)
- [stsb](#stsb-2)
- [wnli](#wnli-2)
- [Data Splits](#data-splits)
- [ax](#ax-3)
- [cola](#cola-3)
- [mnli](#mnli-3)
- [mnli_matched](#mnli_matched-3)
- [mnli_mismatched](#mnli_mismatched-3)
- [mrpc](#mrpc-3)
- [qnli](#qnli-3)
- [qqp](#qqp-3)
- [rte](#rte-3)
- [sst2](#sst2-3)
- [stsb](#stsb-3)
- [wnli](#wnli-3)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://nyu-mll.github.io/CoLA/](https://nyu-mll.github.io/CoLA/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 955.33 MB
- **Size of the generated dataset:** 229.68 MB
- **Total amount of disk used:** 1185.01 MB
### Dataset Summary
GLUE, the General Language Understanding Evaluation benchmark (https://gluebenchmark.com/) is a collection of resources for training, evaluating, and analyzing natural language understanding systems.
### Supported Tasks and Leaderboards
The leaderboard for the GLUE benchmark can be found [at this address](https://gluebenchmark.com/). It comprises the following tasks:
#### ax
A manually-curated evaluation dataset for fine-grained analysis of system performance on a broad range of linguistic phenomena. This dataset evaluates sentence understanding through Natural Language Inference (NLI) problems. Use a model trained on MultiNLI to produce predictions for this dataset.
#### cola
The Corpus of Linguistic Acceptability consists of English acceptability judgments drawn from books and journal articles on linguistic theory. Each example is a sequence of words annotated with whether it is a grammatical English sentence.
#### mnli
The Multi-Genre Natural Language Inference Corpus is a crowdsourced collection of sentence pairs with textual entailment annotations. Given a premise sentence and a hypothesis sentence, the task is to predict whether the premise entails the hypothesis (entailment), contradicts the hypothesis (contradiction), or neither (neutral). The premise sentences are gathered from ten different sources, including transcribed speech, fiction, and government reports. The authors of the benchmark use the standard test set, for which they obtained private labels from the RTE authors, and evaluate on both the matched (in-domain) and mismatched (cross-domain) sections. They also use and recommend the SNLI corpus as 550k examples of auxiliary training data.
#### mnli_matched
The matched validation and test splits from MNLI. See the "mnli" BuilderConfig for additional information.
#### mnli_mismatched
The mismatched validation and test splits from MNLI. See the "mnli" BuilderConfig for additional information.
#### mrpc
The Microsoft Research Paraphrase Corpus (Dolan & Brockett, 2005) is a corpus of sentence pairs automatically extracted from online news sources, with human annotations for whether the sentences in the pair are semantically equivalent.
#### qnli
The Stanford Question Answering Dataset is a question-answering dataset consisting of question-paragraph pairs, where one of the sentences in the paragraph (drawn from Wikipedia) contains the answer to the corresponding question (written by an annotator). The authors of the benchmark convert the task into sentence pair classification by forming a pair between each question and each sentence in the corresponding context, and filtering out pairs with low lexical overlap between the question and the context sentence. The task is to determine whether the context sentence contains the answer to the question. This modified version of the original task removes the requirement that the model select the exact answer, but also removes the simplifying assumptions that the answer is always present in the input and that lexical overlap is a reliable cue.
#### qqp
The Quora Question Pairs2 dataset is a collection of question pairs from the community question-answering website Quora. The task is to determine whether a pair of questions are semantically equivalent.
#### rte
The Recognizing Textual Entailment (RTE) datasets come from a series of annual textual entailment challenges. The authors of the benchmark combined the data from RTE1 (Dagan et al., 2006), RTE2 (Bar Haim et al., 2006), RTE3 (Giampiccolo et al., 2007), and RTE5 (Bentivogli et al., 2009). Examples are constructed based on news and Wikipedia text. The authors of the benchmark convert all datasets to a two-class split, where for three-class datasets they collapse neutral and contradiction into not entailment, for consistency.
#### sst2
The Stanford Sentiment Treebank consists of sentences from movie reviews and human annotations of their sentiment. The task is to predict the sentiment of a given sentence. It uses the two-way (positive/negative) class split, with only sentence-level labels.
#### stsb
The Semantic Textual Similarity Benchmark (Cer et al., 2017) is a collection of sentence pairs drawn from news headlines, video and image captions, and natural language inference data. Each pair is human-annotated with a similarity score from 1 to 5.
#### wnli
The Winograd Schema Challenge (Levesque et al., 2011) is a reading comprehension task in which a system must read a sentence with a pronoun and select the referent of that pronoun from a list of choices. The examples are manually constructed to foil simple statistical methods: each one is contingent on contextual information provided by a single word or phrase in the sentence. To convert the problem into sentence pair classification, the authors of the benchmark construct sentence pairs by replacing the ambiguous pronoun with each possible referent. The task is to predict if the sentence with the pronoun substituted is entailed by the original sentence. They use a small evaluation set consisting of new examples derived from fiction books that was shared privately by the authors of the original corpus. While the included training set is balanced between the two classes, the test set is imbalanced between them (65% not entailment). Also, due to a data quirk, the development set is adversarial: hypotheses are sometimes shared between training and development examples, so if a model memorizes the training examples, it will predict the wrong label on the corresponding development set example. As with QNLI, each example is evaluated separately, so there is not a systematic correspondence between a model's score on this task and its score on the unconverted original task. The authors of the benchmark call the converted dataset WNLI (Winograd NLI).
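The pronoun-substitution conversion described for WNLI can be illustrated with a toy sketch; the helper function and the example sentence are made up for illustration and are not taken from the dataset:

```python
import re

def winograd_to_pairs(sentence, pronoun, referents):
    """Build one (premise, hypothesis) pair per candidate referent by
    substituting the ambiguous pronoun, as described for WNLI."""
    pattern = r"\b%s\b" % re.escape(pronoun)  # whole-word match only
    return [(sentence, re.sub(pattern, referent, sentence, count=1))
            for referent in referents]

premise = "The trophy doesn't fit in the suitcase because it is too big."
pairs = winograd_to_pairs(premise, "it", ["the trophy", "the suitcase"])
# Only the hypothesis naming the correct referent is entailed by the premise.
for _, hypothesis in pairs:
    print(hypothesis)
```

Each schema therefore yields several classification examples, only one of which carries the entailment label, which is why scores on WNLI do not map one-to-one onto scores for the original multiple-choice task.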
### Languages
The language data in GLUE is in English (BCP-47 `en`).
## Dataset Structure
### Data Instances
#### ax
- **Size of downloaded dataset files:** 0.21 MB
- **Size of the generated dataset:** 0.23 MB
- **Total amount of disk used:** 0.44 MB
An example of 'test' looks as follows.
```
{
"premise": "The cat sat on the mat.",
"hypothesis": "The cat did not sit on the mat.",
"label": -1,
  "idx": 0
}
```
#### cola
- **Size of downloaded dataset files:** 0.36 MB
- **Size of the generated dataset:** 0.58 MB
- **Total amount of disk used:** 0.94 MB
An example of 'train' looks as follows.
```
{
"sentence": "Our friends won't buy this analysis, let alone the next one we propose.",
"label": 1,
  "idx": 0
}
```
#### mnli
- **Size of downloaded dataset files:** 298.29 MB
- **Size of the generated dataset:** 78.65 MB
- **Total amount of disk used:** 376.95 MB
An example of 'train' looks as follows.
```
{
"premise": "Conceptually cream skimming has two basic dimensions - product and geography.",
"hypothesis": "Product and geography are what make cream skimming work.",
"label": 1,
"idx": 0
}
```
#### mnli_matched
- **Size of downloaded dataset files:** 298.29 MB
- **Size of the generated dataset:** 3.52 MB
- **Total amount of disk used:** 301.82 MB
An example of 'test' looks as follows.
```
{
"premise": "Hierbas, ans seco, ans dulce, and frigola are just a few names worth keeping a look-out for.",
"hypothesis": "Hierbas is a name worth looking out for.",
"label": -1,
"idx": 0
}
```
#### mnli_mismatched
- **Size of downloaded dataset files:** 298.29 MB
- **Size of the generated dataset:** 3.73 MB
- **Total amount of disk used:** 302.02 MB
An example of 'test' looks as follows.
```
{
"premise": "What have you decided, what are you going to do?",
  "hypothesis": "So what's your decision?",
"label": -1,
"idx": 0
}
```
#### mrpc
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qqp
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### rte
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### sst2
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### stsb
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### wnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Fields
The data fields are the same among all splits.
#### ax
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: an `int32` feature.
#### cola
- `sentence`: a `string` feature.
- `label`: a classification label, with possible values including `unacceptable` (0), `acceptable` (1).
- `idx`: an `int32` feature.
#### mnli
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: an `int32` feature.
#### mnli_matched
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: an `int32` feature.
#### mnli_mismatched
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: an `int32` feature.
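Since labels are stored as integers (with `-1` marking unlabeled test examples, as in the instances above), a small helper can map them back to class names for the NLI-style configs. The mapping below is transcribed from the field descriptions; the helper itself is illustrative, not an official API:

```python
# Integer-to-name mapping for the NLI-style configs
# (ax, mnli, mnli_matched, mnli_mismatched).
NLI_LABELS = {0: "entailment", 1: "neutral", 2: "contradiction"}

def label_name(label_id):
    """Return the class name, or None for the -1 placeholder
    used on unlabeled test sets."""
    return NLI_LABELS.get(label_id)  # -1 is absent, so .get returns None

print(label_name(1))   # neutral
print(label_name(-1))  # None
```

Binary configs such as `cola` would use their own two-class mapping (e.g. `{0: "unacceptable", 1: "acceptable"}`) instead.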
#### mrpc
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qqp
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### rte
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### sst2
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### stsb
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### wnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Splits
#### ax
| |test|
|---|---:|
|ax |1104|
#### cola
| |train|validation|test|
|----|----:|---------:|---:|
|cola| 8551| 1043|1063|
#### mnli
| |train |validation_matched|validation_mismatched|test_matched|test_mismatched|
|----|-----:|-----------------:|--------------------:|-----------:|--------------:|
|mnli|392702| 9815| 9832| 9796| 9847|
#### mnli_matched
| |validation|test|
|------------|---------:|---:|
|mnli_matched| 9815|9796|
#### mnli_mismatched
| |validation|test|
|---------------|---------:|---:|
|mnli_mismatched| 9832|9847|
#### mrpc
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qqp
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### rte
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### sst2
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### stsb
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### wnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```bibtex
@article{warstadt2018neural,
title={Neural Network Acceptability Judgments},
author={Warstadt, Alex and Singh, Amanpreet and Bowman, Samuel R},
journal={arXiv preprint arXiv:1805.12471},
year={2018}
}
@inproceedings{wang2019glue,
title={{GLUE}: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding},
author={Wang, Alex and Singh, Amanpreet and Michael, Julian and Hill, Felix and Levy, Omer and Bowman, Samuel R.},
note={In the Proceedings of ICLR.},
year={2019}
}
```

Note that each GLUE dataset has its own citation. Please see the source for the correct citation for each contained dataset.
### Contributions
Thanks to [@patpizio](https://github.com/patpizio), [@jeswan](https://github.com/jeswan), [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham) for adding this dataset.