| datasetId | card |
|---|---|
autoevaluate/autoeval-eval-xsum-default-01da82-33500145018 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
  task: summarization
  model: t5-small
  metrics: []
  dataset_name: xsum
  dataset_config: default
  dataset_split: test
  col_mapping:
    text: document
    target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: t5-small
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@dfantasy](https://huggingface.co/dfantasy) for evaluating this model. |
CyberHarem/koyanskaya_of_dark_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of koyanskaya_of_dark/闇のコヤンスカヤ/暗之高扬斯卡娅 (Fate/Grand Order)
This is the dataset of koyanskaya_of_dark/闇のコヤンスカヤ/暗之高扬斯卡娅 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `pink_hair, long_hair, animal_ears, breasts, yellow_eyes, animal_ear_fluff, large_breasts, sidelocks, hair_between_eyes, fox_ears, fox_tail, glasses, tail, fox_girl, bow, hair_bow, ponytail, pink_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 816.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koyanskaya_of_dark_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 700.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koyanskaya_of_dark_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1281 | 1.36 GiB | [Download](https://huggingface.co/datasets/CyberHarem/koyanskaya_of_dark_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
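The IMG+TXT packages pair each image file with a same-stem `.txt` file holding its comma-separated tags. A minimal sketch of walking such an extracted archive directory (the `pair_images_and_tags` helper and the extension list are assumptions for illustration, not part of any official tooling):

```python
import os

# Extensions assumed for the image files in an extracted IMG+TXT archive.
IMAGE_EXTS = {'.png', '.jpg', '.jpeg', '.webp'}

def pair_images_and_tags(dataset_dir):
    """Pair each image with its same-stem .txt tag file (assumed layout)."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in IMAGE_EXTS:
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            continue  # image without a tag file is skipped
        with open(txt_path, encoding='utf-8') as f:
            # tag files are typically comma-separated booru-style tags
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```

For the raw (Waifuc-Raw) package, prefer the waifuc loading code shown below, which also exposes the per-image metadata.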
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/koyanskaya_of_dark_fgo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
The following tables list the tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, black_bodysuit, center_opening, choker, cleavage, hip_vent, looking_at_viewer, smile, solo, blush, thighs, collarbone, open_mouth |
| 1 | 11 |  |  |  |  |  | 1boy, 1girl, black_bodysuit, blush, hetero, penis, nipples, vaginal, center_opening, hip_vent, open_mouth, thighs, mosaic_censoring, solo_focus, choker, smile, spread_legs, looking_at_viewer, navel, clothed_sex, collarbone, cum_in_pussy, tongue_out |
| 2 | 14 |  |  |  |  |  | 1girl, bare_shoulders, choker, looking_at_viewer, solo, cleavage, off_shoulder, collarbone, smile, thighs, wide_sleeves, long_sleeves, black_headwear, top_hat, very_long_hair, white_gloves, thighhighs, black_skirt, holding, kimono, open_mouth, red_coat, whip |
| 3 | 9 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, black_gloves, china_dress, double_bun, looking_at_viewer, smile, solo, underboob, folding_fan, holding_fan, tassel, center_opening, jingle_bell, open_mouth, sleeveless_dress |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, center_opening, china_dress, looking_at_viewer, smile, solo, thighs, underboob, black_gloves, blush, double_bun, jingle_bell, sitting, sleeveless_dress, tassel, closed_mouth, side_slit, simple_background, white-framed_eyewear, white_background, open_mouth |
| 5 | 8 |  |  |  |  |  | 1boy, 1girl, blush, hetero, penis, solo_focus, black_gloves, long_sleeves, twintails, fellatio, looking_at_viewer, mosaic_censoring, nipples, rabbit_ears, erection, pov, white_shirt, :>=, male_pubic_hair, black_bowtie, breasts_out, collared_shirt, cum, dress_shirt, open_clothes |
| 6 | 67 |  |  |  |  |  | black_bow, 1girl, rabbit_ears, smile, long_sleeves, looking_at_viewer, twintails, white_shirt, solo, collared_shirt, dress_shirt, underbust, black_gloves, corset, blush, white_pantyhose, coattails, thighs, cloak, leotard, open_mouth, playboy_bunny |
| 7 | 6 |  |  |  |  |  | bare_shoulders, cleavage, black_one-piece_swimsuit, blue_sky, blush, casual_one-piece_swimsuit, looking_at_viewer, thighs, 2girls, highleg_swimsuit, smile, choker, covered_navel, day, grey-framed_eyewear |
| 8 | 5 |  |  |  |  |  | cleavage, hair_ribbon, 1girl, cat_paws, looking_at_viewer, neck_bell, paw_gloves, solo, bare_shoulders, blue_ribbon, detached_sleeves, jingle_bell, red_ribbon, blue_kimono, collarbone, fangs, grey_background, open_mouth, red_kimono, simple_background, smile |
| 9 | 37 |  |  |  |  |  | looking_at_viewer, very_long_hair, 1girl, double_bun, hat, long_sleeves, white_headwear, rabbit_ears, smile, white_dress, detached_collar, pink_gloves, white_coat, cleavage, double-breasted, wide_sleeves, open_coat, solo, short_dress, blush, thighs, white_thighhighs, garter_straps, thigh_boots, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_bodysuit | center_opening | choker | cleavage | hip_vent | looking_at_viewer | smile | solo | blush | thighs | collarbone | open_mouth | 1boy | hetero | penis | nipples | vaginal | mosaic_censoring | solo_focus | spread_legs | navel | clothed_sex | cum_in_pussy | tongue_out | bare_shoulders | off_shoulder | wide_sleeves | long_sleeves | black_headwear | top_hat | very_long_hair | white_gloves | thighhighs | black_skirt | holding | kimono | red_coat | whip | black_dress | black_gloves | china_dress | double_bun | underboob | folding_fan | holding_fan | tassel | jingle_bell | sleeveless_dress | sitting | closed_mouth | side_slit | simple_background | white-framed_eyewear | white_background | twintails | fellatio | rabbit_ears | erection | pov | white_shirt | :>= | male_pubic_hair | black_bowtie | breasts_out | collared_shirt | cum | dress_shirt | open_clothes | black_bow | underbust | corset | white_pantyhose | coattails | cloak | leotard | playboy_bunny | black_one-piece_swimsuit | blue_sky | casual_one-piece_swimsuit | 2girls | highleg_swimsuit | covered_navel | day | grey-framed_eyewear | hair_ribbon | cat_paws | neck_bell | paw_gloves | blue_ribbon | detached_sleeves | red_ribbon | blue_kimono | fangs | grey_background | red_kimono | hat | white_headwear | white_dress | detached_collar | pink_gloves | white_coat | double-breasted | open_coat | short_dress | white_thighhighs | garter_straps | thigh_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------------|:---------|:-----------|:-----------|:--------------------|:--------|:-------|:--------|:---------|:-------------|:-------------|:-------|:---------|:--------|:----------|:----------|:-------------------|:-------------|:--------------|:--------|:--------------|:---------------|:-------------|:-----------------|:---------------|:---------------|:---------------|:-----------------|:----------|:-----------------|:---------------|:-------------|:--------------|:----------|:---------|:-----------|:-------|:--------------|:---------------|:--------------|:-------------|:------------|:--------------|:--------------|:---------|:--------------|:-------------------|:----------|:---------------|:------------|:--------------------|:-----------------------|:-------------------|:------------|:-----------|:--------------|:-----------|:------|:--------------|:------|:------------------|:---------------|:--------------|:-----------------|:------|:--------------|:---------------|:------------|:------------|:---------|:------------------|:------------|:--------|:----------|:----------------|:---------------------------|:-----------|:----------------------------|:---------|:-------------------|:----------------|:------|:----------------------|:--------------|:-----------|:------------|:-------------|:--------------|:-------------------|:-------------|:--------------|:--------|:------------------|:-------------|:------|:-----------------|:--------------|:------------------|:--------------|:-------------|:------------------|:------------|:--------------|:-------------------|:----------------|:--------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | | | X | X | | X | X | X | | X | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | X | | | | X | X | X | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | | | | X | X | X | X | X | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | | | | | X | | | X | | | | X | X | X | X | | X | X | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 67 |  |  |  |  |  | X | | | | | | X | X | X | X | X | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | X | | | X | | | | | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | | | | X | X | | X | X | | X | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | | | | X | | X | X | X | | | X | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 9 | 37 |  |  |  |  |  | X | | | | X | | X | X | X | X | X | | X | | | | | | | | | | | | | | | X | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
allennghayoui/mistral-chat-code-assistant | ---
dataset_info:
  features:
  - name: prompt
    dtype: string
  splits:
  - name: train
    num_bytes: 215248.14583333334
    num_examples: 172
  - name: test
    num_bytes: 12514.427083333334
    num_examples: 10
  - name: validation
    num_bytes: 12514.427083333334
    num_examples: 10
  download_size: 87509
  dataset_size: 240277.00000000003
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
  - split: validation
    path: data/validation-*
---
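The splits above partition 192 prompt records into 172 train / 10 test / 10 validation examples. A sketch of how such a split can be produced from a flat list of records (the `split_prompts` helper and the seed are illustrative assumptions, not part of this dataset's tooling):

```python
import random

def split_prompts(records, n_test=10, n_val=10, seed=0):
    """Shuffle and partition prompt records into train/test/validation,
    mirroring the split sizes declared in the card (172/10/10)."""
    rng = random.Random(seed)
    shuffled = records[:]
    rng.shuffle(shuffled)
    return {
        'test': shuffled[:n_test],
        'validation': shuffled[n_test:n_test + n_val],
        'train': shuffled[n_test + n_val:],
    }

# with 192 records total, this reproduces the declared split sizes
records = [{'prompt': f'prompt {i}'} for i in range(192)]
splits = split_prompts(records)
```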
|
Atipico1/nq_test | ---
dataset_info:
  features:
  - name: question
    dtype: string
  - name: answers
    sequence: string
  - name: ctxs
    list:
    - name: hasanswer
      dtype: bool
    - name: score
      dtype: float64
    - name: text
      dtype: string
    - name: title
      dtype: string
  splits:
  - name: test
    num_bytes: 12000585
    num_examples: 3610
  download_size: 7040037
  dataset_size: 12000585
configs:
- config_name: default
  data_files:
  - split: test
    path: data/test-*
---
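The nested `ctxs` feature describes retrieved passages with relevance scores and a `hasanswer` flag, a common layout for open-domain QA over Natural Questions. A sketch of one record in this schema and a hit@k check over it (field values and the `hit_at_k` helper are illustrative, not taken from the dataset):

```python
# Illustrative record in this schema: a question, gold answers, and
# retriever-scored contexts; `hasanswer` marks whether a context
# contains a gold answer.
record = {
    'question': 'who wrote the declaration of independence',
    'answers': ['Thomas Jefferson'],
    'ctxs': [
        {'hasanswer': True,  'score': 81.2, 'text': '...', 'title': '...'},
        {'hasanswer': False, 'score': 74.9, 'text': '...', 'title': '...'},
    ],
}

def hit_at_k(record, k):
    """Retrieval hit@k: does any of the top-k contexts contain an answer?"""
    top_k = sorted(record['ctxs'], key=lambda c: c['score'], reverse=True)[:k]
    return any(c['hasanswer'] for c in top_k)
```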
|
dipeshshendre/guanaco-llama2-1k | ---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 1267307.168
    num_examples: 766
  download_size: 874822
  dataset_size: 1267307.168
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
|
CyberHarem/z23_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of z23/Z23 (Azur Lane)
This is the dataset of z23/Z23 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `short_hair, breasts, bangs, bow, blue_eyes, blonde_hair, hair_bow, purple_eyes, hat, medium_breasts, beret`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 638.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z23_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 363.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z23_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1206 | 801.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z23_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 561.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z23_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1206 | 1.11 GiB | [Download](https://huggingface.co/datasets/CyberHarem/z23_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/z23_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
The following tables list the tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1boy, 1girl, blush, hetero, sex, vaginal, nipples, penis, solo_focus, navel, open_mouth, spread_legs, looking_at_viewer, cum_in_pussy, cowgirl_position, gloves, iron_cross, mosaic_censoring, sweat, collarbone, completely_nude, large_breasts |
| 1 | 9 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, blush, bridal_veil, iron_cross, looking_at_viewer, sleeveless_dress, solo, wedding_dress, black_gloves, necklace, turtleneck_dress, red_rose, frills, hair_between_eyes, hair_ornament, holding, simple_background, smile, white_background, sidelocks |
| 2 | 18 |  |  |  |  |  | 1girl, bare_shoulders, iron_cross, simple_background, solo, white_background, white_gloves, looking_at_viewer, bike_shorts, blush, hair_between_eyes, ribbon, detached_sleeves, standing, open_mouth, sideboob, black_shorts, double-breasted, sleeveless |
| 3 | 11 |  |  |  |  |  | 1girl, ocean, bare_shoulders, cloud, day, open_mouth, outdoors, iron_cross, looking_at_viewer, smile, solo, blue_sky, blush, see-through, cleavage, thigh_strap, barefoot, black_bikini, choker, hair_between_eyes, navel, ribbon, water, beach |
| 4 | 12 |  |  |  |  |  | blue_shirt, blush, looking_at_viewer, midriff, solo, wrist_cuffs, 1girl, blue_headwear, blue_skirt, crop_top, navel, neck_ribbon, sleeveless_shirt, white_sailor_collar, frills, pleated_skirt, smile, yellow_ribbon, bare_shoulders, collarbone, white_background, hair_between_eyes, simple_background, blue_serafuku, closed_mouth, hand_on_hip, hand_up, anchor_symbol, shorts_under_skirt, standing |
| 5 | 6 |  |  |  |  |  | fake_animal_ears, gloves, iron_cross, rabbit_ears, bare_shoulders, looking_at_viewer, open_mouth, sleeveless, 2girls, blush, hair_between_eyes, ribbon, simple_background, white_background, 3girls, light_brown_hair, long_hair, sidelocks, smile, solo_focus, twintails, white_hair |
| 6 | 11 |  |  |  |  |  | looking_at_viewer, mini_hat, plaid_skirt, 1girl, bare_shoulders, midriff, sleeveless_shirt, solo, tilted_headwear, white_gloves, idol, iron_cross, navel, blush, braid, crop_top, red_bow, smile, standing, black_headwear, black_thighhighs, collared_shirt, elbow_gloves, headset, plaid_bow, bowtie, open_mouth, plaid_headwear, armpits, belt, black_footwear, black_skirt, hair_between_eyes, miniskirt, official_alternate_costume, pleated_skirt, top_hat, white_background, white_shirt, zettai_ryouiki, black_bow, closed_mouth, detached_sleeves, frilled_skirt, full_body, hand_up, shoes |
| 7 | 11 |  |  |  |  |  | bare_shoulders, looking_at_viewer, 1girl, glasses, solo, necktie, sideboob, black_skirt, blush, sleeveless, black_pantyhose, off_shoulder, pencil_skirt, simple_background, white_background, miniskirt, shirt, black_bow, labcoat, smile |
| 8 | 9 |  |  |  |  |  | 1girl, bikini, blush, cow_ears, cow_horns, cow_print, elbow_gloves, fake_animal_ears, bare_shoulders, large_breasts, thighhighs, collarbone, cow_tail, fake_horns, looking_at_viewer, white_gloves, alternate_costume, navel, simple_background, solo, sweat, brown_hair, cleavage, cow_girl, open_mouth |
| 9 | 12 |  |  |  |  |  | 1girl, bikini_armor, large_breasts, red_bikini, navel, bare_shoulders, blush, fingerless_gloves, hair_between_eyes, red_armor, solo, cleavage, headgear, official_alternate_costume, standing, thigh_strap, groin, sidelocks, cowboy_shot, highleg_bikini, looking_at_viewer, armlet, closed_mouth, elbow_gloves, sweat |
| 10 | 9 |  |  |  |  |  | bare_shoulders, blush, 1girl, looking_at_viewer, one_side_up, red_scrunchie, solo, hair_scrunchie, short_shorts, barefoot, black_shorts, cleavage, collarbone, wrist_scrunchie, bare_legs, black_camisole, brown_hair, feet, indoors, midriff, toes, alternate_costume, closed_mouth, full_body, hair_between_eyes, holding_cup, navel, plant, sitting, soles |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | blush | hetero | sex | vaginal | nipples | penis | solo_focus | navel | open_mouth | spread_legs | looking_at_viewer | cum_in_pussy | cowgirl_position | gloves | iron_cross | mosaic_censoring | sweat | collarbone | completely_nude | large_breasts | bare_shoulders | black_dress | bridal_veil | sleeveless_dress | solo | wedding_dress | black_gloves | necklace | turtleneck_dress | red_rose | frills | hair_between_eyes | hair_ornament | holding | simple_background | smile | white_background | sidelocks | white_gloves | bike_shorts | ribbon | detached_sleeves | standing | sideboob | black_shorts | double-breasted | sleeveless | ocean | cloud | day | outdoors | blue_sky | see-through | cleavage | thigh_strap | barefoot | black_bikini | choker | water | beach | blue_shirt | midriff | wrist_cuffs | blue_headwear | blue_skirt | crop_top | neck_ribbon | sleeveless_shirt | white_sailor_collar | pleated_skirt | yellow_ribbon | blue_serafuku | closed_mouth | hand_on_hip | hand_up | anchor_symbol | shorts_under_skirt | fake_animal_ears | rabbit_ears | 2girls | 3girls | light_brown_hair | long_hair | twintails | white_hair | mini_hat | plaid_skirt | tilted_headwear | idol | braid | red_bow | black_headwear | black_thighhighs | collared_shirt | elbow_gloves | headset | plaid_bow | bowtie | plaid_headwear | armpits | belt | black_footwear | black_skirt | miniskirt | official_alternate_costume | top_hat | white_shirt | zettai_ryouiki | black_bow | frilled_skirt | full_body | shoes | glasses | necktie | black_pantyhose | off_shoulder | pencil_skirt | shirt | labcoat | bikini | cow_ears | cow_horns | cow_print | thighhighs | cow_tail | fake_horns | alternate_costume | brown_hair | cow_girl | bikini_armor | red_bikini | fingerless_gloves | red_armor | headgear | groin | cowboy_shot | highleg_bikini | armlet | one_side_up | red_scrunchie | hair_scrunchie | short_shorts | wrist_scrunchie | bare_legs | black_camisole | feet | indoors | toes | holding_cup | plant | sitting | soles |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------|:--------|:--------|:---------|:------|:----------|:----------|:--------|:-------------|:--------|:-------------|:--------------|:--------------------|:---------------|:-------------------|:---------|:-------------|:-------------------|:--------|:-------------|:------------------|:----------------|:-----------------|:--------------|:--------------|:-------------------|:-------|:----------------|:---------------|:-----------|:-------------------|:-----------|:---------|:--------------------|:----------------|:----------|:--------------------|:--------|:-------------------|:------------|:---------------|:--------------|:---------|:-------------------|:-----------|:-----------|:---------------|:------------------|:-------------|:--------|:--------|:------|:-----------|:-----------|:--------------|:-----------|:--------------|:-----------|:---------------|:---------|:--------|:--------|:-------------|:----------|:--------------|:----------------|:-------------|:-----------|:--------------|:-------------------|:----------------------|:----------------|:----------------|:----------------|:---------------|:--------------|:----------|:----------------|:---------------------|:-------------------|:--------------|:---------|:---------|:-------------------|:------------|:------------|:-------------|:-----------|:--------------|:------------------|:-------|:--------|:----------|:-----------------|:-------------------|:-----------------|:---------------|:----------|:------------|:---------|:-----------------|:----------|:-------|:-----------------|:--------------|:------------|:-----------------------------|:----------|:--------------|:-----------------|:------------|:----------------|:------------|:--------|:----------|:----------|:------------------|:---------------|:---------------|:--------|:----------|:---------|:-----------|:------------|:------------|:-------------|:-----------|:-------------|:--------------------|:-------------|:-----------|:---------------|:-------------|:--------------------|:------------|:-----------|:--------|:--------------|:-----------------|:---------|:--------------|:----------------|:-----------------|:---------------|:------------------|:------------|:-----------------|:-------|:----------|:-------|:--------------|:--------|:----------|:--------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | | X | X | | | | | | | | | | X | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 18 |  |  |  |  |  | | X | X | | | | | | | | X | | X | | | | X | | | | | | X | | | | X | | | | | | | X | | | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | | X | X | | | | | | | X | X | | X | | | | X | | | | | | X | | | | X | | | | | | | X | | | | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | | X | X | | | | | | | X | | | X | | | | | | | X | | | X | | | | X | | | | | | X | X | | | X | X | X | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | | | X | | | | | | X | | X | | X | | | X | X | | | | | | X | | | | | | | | | | | X | | | X | X | X | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 11 |  |  |  |  |  | | X | X | | | | | | | X | X | | X | | | | X | | | | | | X | | | | X | | | | | | | X | | | | X | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | X | | | | X | | X | | X | | | X | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 11 |  |  |  |  |  | | X | X | | | | | | | | | | X | | | | | | | | | | X | | | | X | | | | | | | | | | X | X | X | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | | X | X | | | | | | | X | X | | X | | | | | | X | X | | X | X | | | | X | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 12 |  |  |  |  |  | | X | X | | | | | | | X | | | X | | | | | | X | | | X | X | | | | X | | | | | | | X | | | | | | X | | | | | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 10 | 9 |  |  |  |  |  | | X | X | | | | | | | X | | | X | | | | | | | X | | | X | | | | X | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | X | | X | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
mask-distilled-one-sec-cv12/chunk_258 | ---
dataset_info:
  features:
  - name: logits
    sequence: float32
  - name: mfcc
    sequence:
      sequence: float64
  splits:
  - name: train
    num_bytes: 1204135792
    num_examples: 236476
  download_size: 1228105467
  dataset_size: 1204135792
---
# Dataset Card for "chunk_258"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
betteracs/thai-receipt-ocr-v3 | ---
dataset_info:
  features:
  - name: id
    dtype: int64
  - name: image
    dtype: image
  - name: label
    dtype: string
  splits:
  - name: train
    num_bytes: 12137410.0
    num_examples: 710
  download_size: 12133639
  dataset_size: 12137410.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
|
open-llm-leaderboard/details_decapoda-research__Antares-11b-v2 | ---
pretty_name: Evaluation run of decapoda-research/Antares-11b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [decapoda-research/Antares-11b-v2](https://huggingface.co/decapoda-research/Antares-11b-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_decapoda-research__Antares-11b-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T16:39:51.423200](https://huggingface.co/datasets/open-llm-leaderboard/details_decapoda-research__Antares-11b-v2/blob/main/results_2024-02-09T16-39-51.423200.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6644514317819195,\n\
\ \"acc_stderr\": 0.03187434701699903,\n \"acc_norm\": 0.6660391055378342,\n\
\ \"acc_norm_stderr\": 0.03252257060439031,\n \"mc1\": 0.4394124847001224,\n\
\ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.5916593502712777,\n\
\ \"mc2_stderr\": 0.01545426515730703\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6706484641638225,\n \"acc_stderr\": 0.013734057652635474,\n\
\ \"acc_norm\": 0.6902730375426621,\n \"acc_norm_stderr\": 0.013512058415238363\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6933877713602868,\n\
\ \"acc_stderr\": 0.004601446124041572,\n \"acc_norm\": 0.8754232224656443,\n\
\ \"acc_norm_stderr\": 0.0032956349076664654\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266237,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266237\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4708994708994709,\n \"acc_stderr\": 0.02570765861415496,\n \"\
acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.02570765861415496\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"\
acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289715,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289715\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971128,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887044,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887044\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501555,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501555\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579832,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579832\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4659217877094972,\n\
\ \"acc_stderr\": 0.016683615837486863,\n \"acc_norm\": 0.4659217877094972,\n\
\ \"acc_norm_stderr\": 0.016683615837486863\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.752411575562701,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.752411575562701,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4980443285528031,\n\
\ \"acc_stderr\": 0.012770138422208626,\n \"acc_norm\": 0.4980443285528031,\n\
\ \"acc_norm_stderr\": 0.012770138422208626\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144717,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144717\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468712,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468712\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174937,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174937\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.028782108105401705,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.028782108105401705\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n\
\ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.5916593502712777,\n\
\ \"mc2_stderr\": 0.01545426515730703\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166739\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6050037907505686,\n \
\ \"acc_stderr\": 0.0134653549699732\n }\n}\n```"
repo_url: https://huggingface.co/decapoda-research/Antares-11b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|arc:challenge|25_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|gsm8k|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hellaswag|10_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T16-39-51.423200.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T16-39-51.423200.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- '**/details_harness|winogrande|5_2024-02-09T16-39-51.423200.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T16-39-51.423200.parquet'
- config_name: results
data_files:
- split: 2024_02_09T16_39_51.423200
path:
- results_2024-02-09T16-39-51.423200.parquet
- split: latest
path:
- results_2024-02-09T16-39-51.423200.parquet
---
# Dataset Card for Evaluation run of decapoda-research/Antares-11b-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [decapoda-research/Antares-11b-v2](https://huggingface.co/decapoda-research/Antares-11b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, with the split named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_decapoda-research__Antares-11b-v2",
"harness_winogrande_5",
split="train")
```
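The per-task configuration names follow a mechanical pattern, visible in the YAML header above: `harness_` plus the task name (with `-` and `:` replaced by `_`) plus the number of few-shot examples. A small helper for building them (a sketch based on that naming pattern, not an official API):

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build the config name for a per-task details configuration.

    Follows the pattern visible in the YAML header, e.g.
    'harness_hendrycksTest_world_religions_5'.
    """
    return f"harness_{task.replace('-', '_').replace(':', '_')}_{num_fewshot}"

# Examples matching configurations listed above
print(harness_config_name("hendrycksTest-world_religions", 5))
print(harness_config_name("truthfulqa:mc", 0))
```

The resulting names can be passed as the second argument of `load_dataset` as shown in the snippet above.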
## Latest results
These are the [latest results from run 2024-02-09T16:39:51.423200](https://huggingface.co/datasets/open-llm-leaderboard/details_decapoda-research__Antares-11b-v2/blob/main/results_2024-02-09T16-39-51.423200.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6644514317819195,
"acc_stderr": 0.03187434701699903,
"acc_norm": 0.6660391055378342,
"acc_norm_stderr": 0.03252257060439031,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.5916593502712777,
"mc2_stderr": 0.01545426515730703
},
"harness|arc:challenge|25": {
"acc": 0.6706484641638225,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.6902730375426621,
"acc_norm_stderr": 0.013512058415238363
},
"harness|hellaswag|10": {
"acc": 0.6933877713602868,
"acc_stderr": 0.004601446124041572,
"acc_norm": 0.8754232224656443,
"acc_norm_stderr": 0.0032956349076664654
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.02570765861415496,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.02570765861415496
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289715,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289715
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971128,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971128
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887044,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887044
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501555,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501555
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579832,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4659217877094972,
"acc_stderr": 0.016683615837486863,
"acc_norm": 0.4659217877094972,
"acc_norm_stderr": 0.016683615837486863
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.752411575562701,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.752411575562701,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4980443285528031,
"acc_stderr": 0.012770138422208626,
"acc_norm": 0.4980443285528031,
"acc_norm_stderr": 0.012770138422208626
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144717,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144717
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.018850084696468712,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.018850084696468712
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174937,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174937
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.028782108105401705,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.028782108105401705
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.5916593502712777,
"mc2_stderr": 0.01545426515730703
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.010510336954166739
},
"harness|gsm8k|5": {
"acc": 0.6050037907505686,
"acc_stderr": 0.0134653549699732
}
}
```
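The `"all"` block at the top is an aggregate over the per-task scores below it. A minimal sketch of averaging the per-subject `acc` values for the `hendrycksTest` (MMLU) tasks from a results dict like the one above (only three subjects shown for brevity; the exact leaderboard aggregation may differ):

```python
# Subset of the per-task results above, keyed in the same format
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.4},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7105263157894737},
}

# Collect the accuracies of all MMLU subjects and take the unweighted mean
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU average over {len(mmlu_accs)} subjects: {mmlu_avg:.4f}")
```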
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anan-2024/twitter_dataset_1713000510 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 27125
num_examples: 60
download_size: 15416
dataset_size: 27125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/r93_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of r93/R93/R93 (Girls' Frontline)
This is the dataset of r93/R93/R93 (Girls' Frontline), containing 45 images and their tags.
The core tags of this character are `green_eyes, pink_hair, breasts, long_hair, bangs, sunglasses, medium_breasts, eyewear_on_head`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 45 | 66.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/r93_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 45 | 34.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/r93_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 111 | 75.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/r93_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 45 | 57.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/r93_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 111 | 110.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/r93_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/r93_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, cleavage, solo, navel, official_alternate_costume, collarbone, looking_at_viewer, simple_background, white_background, white_bikini, black_bikini, blush, thigh_strap, bare_shoulders, choker, side-tie_bikini_bottom, side_ponytail, armpits, bag, closed_mouth, hair_between_eyes, jacket |
| 1 | 9 |  |  |  |  |  | 1girl, closed_mouth, smile, solo, simple_background, black_gloves, cleavage, looking_at_viewer, sniper_rifle, standing, white_background, dress, hair_ribbon, hairband, holding_gun, long_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | solo | navel | official_alternate_costume | collarbone | looking_at_viewer | simple_background | white_background | white_bikini | black_bikini | blush | thigh_strap | bare_shoulders | choker | side-tie_bikini_bottom | side_ponytail | armpits | bag | closed_mouth | hair_between_eyes | jacket | smile | black_gloves | sniper_rifle | standing | dress | hair_ribbon | hairband | holding_gun | long_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:--------|:-----------------------------|:-------------|:--------------------|:--------------------|:-------------------|:---------------|:---------------|:--------|:--------------|:-----------------|:---------|:-------------------------|:----------------|:----------|:------|:---------------|:--------------------|:---------|:--------|:---------------|:---------------|:-----------|:--------|:--------------|:-----------|:--------------|:---------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | | | | X | X | X | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X |
|
Crystalcareai/WeakauraInfo | ---
license: apache-2.0
---
|
heliosprime/twitter_dataset_1713156758 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11205
num_examples: 30
download_size: 13159
dataset_size: 11205
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713156758"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
victtin96/embaixador | ---
license: openrail
---
|
daqc/constitucion_politica_del_peru_1993_q_argilla | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for constitucion_politica_del_peru_1993_q_argilla
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla with `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("daqc/constitucion_politica_del_peru_1993_q_argilla")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` with `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("daqc/constitucion_politica_del_peru_1993_q_argilla")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| input | input | text | True | True |
| instructions | instructions | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| instruction-rating | How would you rate the generated instruction? | rating | True | N/A | [1, 2, 3, 4, 5, 6, 7, 8, 9, 10] |
| curated-instruction | accurate instruction | text | True | If you think the instruction is not accurate, please correct it. If the original instruction is ok, copy and paste it here. | N/A |
The **suggestions** are human- or machine-generated recommendations for each question, meant to assist the annotator during the annotation process. They are always linked to the existing questions and are named by appending "-suggestion" and "-suggestion-metadata" to the question names, containing the value/s of the suggestion and its metadata, respectively. Accordingly, the possible values are the same as in the table above, with the column name appended with "-suggestion" and the metadata appended with "-suggestion-metadata".
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to give annotators extra context, or to capture details about the record itself, such as a link to its original source, the author, the date, or the provenance. The metadata is always optional, and can potentially be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
**✨ NEW** The **vectors** are additional columns containing floating-point vectors, constrained to the dimensions pre-defined in the **vectors_settings** when configuring the vectors within the dataset itself; the vectors are always 1-dimensional. The **vectors** are optional and are identified by the pre-defined vector name in the dataset configuration file in `argilla.yaml`.
| Vector Name | Title | Dimensions |
|-------------|-------|------------|
| input | input | [1, 384] |
| instructions | instructions | [1, 384] |
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
| length-input | length-input | integer | None - None | True |
| length-instruction | length-instruction | integer | None - None | True |
| input_n_tokens | Input N Tokens | integer | None - None | True |
| input_n_unique_tokens | Input N Unique Tokens | integer | None - None | True |
| input_n_sentences | Input N Sentences | integer | None - None | True |
| input_perplexity | Input Perplexity | float | None - None | True |
| input_entropy | Input Entropy | float | None - None | True |
| input_flesch_reading_ease | Input Flesch Reading Ease | float | None - None | True |
| instructions_n_tokens | Instructions N Tokens | integer | None - None | True |
| instructions_n_unique_tokens | Instructions N Unique Tokens | integer | None - None | True |
| instructions_n_sentences | Instructions N Sentences | integer | None - None | True |
| instructions_perplexity | Instructions Perplexity | float | None - None | True |
| instructions_entropy | Instructions Entropy | float | None - None | True |
| instructions_flesch_reading_ease | Instructions Flesch Reading Ease | float | None - None | True |
The **guidelines** are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"input": "CONSTITUCI\u00d3N POL\u00cdTICA DEL PER\u00da P R E \u00c1 M B U L O EL CONGRESO CONSTITUYENTE DEMOCR\u00c1TICO INVOCANDO A DIOS TODOPODEROSO OBEDECIENDO EL MANDATO DEL PUEBLO PERUANO Y RECORDANDO EL SACRIFICIO DE TODAS LAS GENERACIONES QUE NOS HAN PRECEDIDO EN NUESTRA PATRIA HA RESUELTO DAR LA SIGUIENTE CONSTITUCION T\u00cdTULO I DE",
"instructions": "\u00bfCu\u00e1l es el prop\u00f3sito del Pre\u00e1mbulo en la Constituci\u00f3n Pol\u00edtica del Per\u00fa?"
},
"metadata": {
"generation-model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
"input_entropy": 0.09,
"input_flesch_reading_ease": 63.42,
"input_n_sentences": 7,
"input_n_tokens": 51,
"input_n_unique_tokens": 47,
"input_perplexity": 1.1,
"instructions_entropy": 0.03,
"instructions_flesch_reading_ease": 74.81,
"instructions_n_sentences": 1,
"instructions_n_tokens": 12,
"instructions_n_unique_tokens": 11,
"instructions_perplexity": 1.03,
"length-input": 305,
"length-instructions": 73
},
"responses": [],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"curated-instruction": [],
"curated-instruction-suggestion": null,
"curated-instruction-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"external_id": null,
"input": "CONSTITUCI\u00d3N POL\u00cdTICA DEL PER\u00da P R E \u00c1 M B U L O EL CONGRESO CONSTITUYENTE DEMOCR\u00c1TICO INVOCANDO A DIOS TODOPODEROSO OBEDECIENDO EL MANDATO DEL PUEBLO PERUANO Y RECORDANDO EL SACRIFICIO DE TODAS LAS GENERACIONES QUE NOS HAN PRECEDIDO EN NUESTRA PATRIA HA RESUELTO DAR LA SIGUIENTE CONSTITUCION T\u00cdTULO I DE",
"instruction-rating": [],
"instruction-rating-suggestion": null,
"instruction-rating-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"instructions": "\u00bfCu\u00e1l es el prop\u00f3sito del Pre\u00e1mbulo en la Constituci\u00f3n Pol\u00edtica del Per\u00fa?",
"metadata": "{\"length-input\": 305, \"length-instructions\": 73, \"generation-model\": \"mistralai/Mixtral-8x7B-Instruct-v0.1\", \"input_n_tokens\": 51, \"input_n_unique_tokens\": 47, \"input_n_sentences\": 7, \"input_perplexity\": 1.1, \"input_entropy\": 0.09, \"input_flesch_reading_ease\": 63.42, \"instructions_n_tokens\": 12, \"instructions_n_unique_tokens\": 11, \"instructions_n_sentences\": 1, \"instructions_perplexity\": 1.03, \"instructions_entropy\": 0.03, \"instructions_flesch_reading_ease\": 74.81}",
"vectors": {
"input": null,
"instructions": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **input** is of type `text`.
* **instructions** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **instruction-rating** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].
* **curated-instruction** is of type `text`, with the description "If you think the instruction is not accurate, please correct it. If the original instruction is ok, copy and paste it here.".
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **instruction-rating-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7, 8, 9, 10].
* (optional) **curated-instruction-suggestion** is of type `text`.
* **✨ NEW** **Vectors**: As of Argilla 1.19.0, vectors have been included to add support for similarity search, so that similar records can be explored via vector search powered by the configured search engine. The vectors are always optional and are not visible in the UI; they are uploaded and used internally, and must match the dimensions previously defined in their settings.
* (optional) **input** is of type `float32` and has a dimension of (1, `384`).
* (optional) **instructions** is of type `float32` and has a dimension of (1, `384`).
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to give annotators extra context, or to capture details about the record itself, such as a link to its original source, the author, the date, or the provenance. The metadata is always optional, and can potentially be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
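As the `datasets`-format record above shows, the `metadata` field arrives as a single JSON-encoded string rather than a nested object, so it needs to be decoded before use. A minimal sketch (the `record` below is a shortened, hypothetical stand-in for a real row):

```python
import json

# Shortened, hypothetical stand-in for a record loaded via `load_dataset`;
# real rows carry the full metadata shown in the example above.
record = {
    "instructions": "¿Cuál es el propósito del Preámbulo?",
    "metadata": '{"length-input": 305, "generation-model": "mistralai/Mixtral-8x7B-Instruct-v0.1"}',
}

# The metadata is a JSON string, so parse it explicitly before indexing.
meta = json.loads(record["metadata"])
print(meta["generation-model"])  # -> mistralai/Mixtral-8x7B-Instruct-v0.1
```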
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
The aim of the project is to correct the instructions to make sure they are of the highest quality.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
anan-2024/twitter_dataset_1712985555 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 249444
num_examples: 682
download_size: 129969
dataset_size: 249444
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sezenkarakus/image-description-dataset-v2 | ---
dataset_info:
features:
- name: file_name
dtype: string
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 6815664418.75
num_examples: 19610
download_size: 6811357830
dataset_size: 6815664418.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Filippo/distilabel-intel-orca-dpo-pairs-filtered | ---
dataset_info:
features:
- name: system
dtype: string
- name: question
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: generations
sequence: string
- name: order
sequence: string
- name: labelling_model
dtype: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
dtype: string
- name: rating
sequence: float64
- name: rationale
dtype: string
- name: status
dtype: string
- name: original_chosen
dtype: string
- name: original_rejected
dtype: string
- name: chosen_score
dtype: float64
- name: in_gsm8k_train
dtype: bool
splits:
- name: train
num_bytes: 67071699.50314955
num_examples: 5329
- name: test
num_bytes: 7463598.7625009725
num_examples: 593
download_size: 36944857
dataset_size: 74535298.26565053
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
tags:
- synthetic
- distilabel
---
|
alexshengzhili/SciCapInstructed-graph-only-qa | ---
license: mit
dataset_info:
features:
- name: image_file
dtype: string
- name: id
dtype: string
- name: caption
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: first_mention
dtype: string
- name: response
dtype: string
- name: title
dtype: string
- name: abstract
dtype: string
- name: q_a_pairs
sequence:
sequence: string
splits:
- name: 1_percent_as_validation
num_bytes: 16096860.454545455
num_examples: 3002
download_size: 7889034
dataset_size: 16096860.454545455
---
|
atmallen/quirky_sciq_pythia-410m_alice_hard | ---
dataset_info:
features:
- name: id
dtype: string
- name: choices
sequence: string
- name: label
dtype: int64
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: character
dtype: string
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: bob_log_odds
dtype: float64
splits:
- name: train
num_bytes: 3638308.4990153266
num_examples: 5840
- name: validation
num_bytes: 337632.39
num_examples: 548
- name: test
num_bytes: 297513.921
num_examples: 474
download_size: 1364408
dataset_size: 4273454.810015326
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
rassibassi/sample_mnist | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
'7': '7'
'8': '8'
'9': '9'
splits:
- name: train
num_bytes: 3447853.0
num_examples: 12000
- name: test
num_bytes: 563331.0
num_examples: 2000
download_size: 3325934
dataset_size: 4011184.0
---
# Dataset Card for "sample_mnist"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Fiery06/fashion_image_caption-100-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22820471.0
num_examples: 100
download_size: 22820373
dataset_size: 22820471.0
---
# Dataset Card for "fashion_image_caption-100-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
judy93536/perigon-200k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 216299930.0087607
num_examples: 176584
- name: test
num_bytes: 38170719.9912393
num_examples: 31162
download_size: 129060894
dataset_size: 254470650.0
---
# Dataset Card for "perigon-200k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marcones/locutoroficial | ---
license: openrail
---
|
Reymaaref/QAER | ---
task_categories:
- question-answering
language:
- en
tags:
- medical
--- |
ainzOulgun/fshdf | ---
license: openrail
---
|
houck2040/today_news | ---
license: mit
---
|
Nma/resume_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 355695532
num_examples: 161071
- name: train
num_bytes: 1421896716
num_examples: 644282
download_size: 896434509
dataset_size: 1777592248
---
# Dataset Card for "resume_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyhuang/ShapeNet_Rendering | ---
license: apache-2.0
---
|
qanta | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- found
license:
- unknown
multilinguality:
- monolingual
pretty_name: Quizbowl
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- question-answering
task_ids: []
paperswithcode_id: quizbowl
tags:
- quizbowl
dataset_info:
features:
- name: id
dtype: string
- name: qanta_id
dtype: int32
- name: proto_id
dtype: string
- name: qdb_id
dtype: int32
- name: dataset
dtype: string
- name: text
dtype: string
- name: full_question
dtype: string
- name: first_sentence
dtype: string
- name: char_idx
dtype: int32
- name: sentence_idx
dtype: int32
- name: tokenizations
sequence:
sequence: int32
length: 2
- name: answer
dtype: string
- name: page
dtype: string
- name: raw_answer
dtype: string
- name: fold
dtype: string
- name: gameplay
dtype: bool
- name: category
dtype: string
- name: subcategory
dtype: string
- name: tournament
dtype: string
- name: difficulty
dtype: string
- name: year
dtype: int32
config_name: mode=first,char_skip=25
splits:
- name: adversarial
num_bytes: 1258844
num_examples: 1145
- name: buzzdev
num_bytes: 1553636
num_examples: 1161
- name: buzztest
num_bytes: 2653425
num_examples: 1953
- name: buzztrain
num_bytes: 19699736
num_examples: 16706
- name: guessdev
num_bytes: 1414882
num_examples: 1055
- name: guesstest
num_bytes: 2997123
num_examples: 2151
- name: guesstrain
num_bytes: 117599750
num_examples: 96221
download_size: 170754918
dataset_size: 147177396
---
# Dataset Card for "qanta"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://www.qanta.org/](http://www.qanta.org/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [Quizbowl: The Case for Incremental Question Answering](https://arxiv.org/abs/1904.04792)
- **Point of Contact:** [Jordan Boyd-Graber](mailto:jbg@umiacs.umd.edu)
- **Size of downloaded dataset files:** 170.75 MB
- **Size of the generated dataset:** 147.18 MB
- **Total amount of disk used:** 317.93 MB
### Dataset Summary
The Qanta dataset is a question answering dataset based on the academic trivia game Quizbowl.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### mode=first,char_skip=25
- **Size of downloaded dataset files:** 170.75 MB
- **Size of the generated dataset:** 147.18 MB
- **Total amount of disk used:** 317.93 MB
An example of 'guessdev' looks as follows.
```
This example was too long and was cropped:
{
"answer": "Apollo_program",
"category": "History",
"char_idx": -1,
"dataset": "quizdb.org",
"difficulty": "easy_college",
"first_sentence": "As part of this program, William Anders took a photo that Galen Rowell called \"the most influential environmental photograph ever taken.\"",
"fold": "guessdev",
"full_question": "\"As part of this program, William Anders took a photo that Galen Rowell called \\\"the most influential environmental photograph e...",
"gameplay": false,
"id": "127028-first",
"page": "Apollo_program",
"proto_id": "",
"qanta_id": 127028,
"qdb_id": 126689,
"raw_answer": "Apollo program [or Project Apollo; accept Apollo 8; accept Apollo 1; accept Apollo 11; prompt on landing on the moon]",
"sentence_idx": -1,
"subcategory": "American",
"text": "As part of this program, William Anders took a photo that Galen Rowell called \"the most influential environmental photograph ever taken.\"",
"tokenizations": [[0, 137], [138, 281], [282, 412], [413, 592], [593, 675]],
"tournament": "ACF Fall",
"year": 2016
}
```
### Data Fields
The data fields are the same among all splits.
#### mode=first,char_skip=25
- `id`: a `string` feature.
- `qanta_id`: an `int32` feature.
- `proto_id`: a `string` feature.
- `qdb_id`: an `int32` feature.
- `dataset`: a `string` feature.
- `text`: a `string` feature.
- `full_question`: a `string` feature.
- `first_sentence`: a `string` feature.
- `char_idx`: an `int32` feature.
- `sentence_idx`: an `int32` feature.
- `tokenizations`: a sequence of length-2 `int32` sequences, each a pair of character offsets delimiting a sentence in the question text.
- `answer`: a `string` feature.
- `page`: a `string` feature.
- `raw_answer`: a `string` feature.
- `fold`: a `string` feature.
- `gameplay`: a `bool` feature.
- `category`: a `string` feature.
- `subcategory`: a `string` feature.
- `tournament`: a `string` feature.
- `difficulty`: a `string` feature.
- `year`: an `int32` feature.
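The `tokenizations` field makes it straightforward to recover individual sentences from the question text by character slicing. A minimal sketch, using a small hypothetical record rather than an actual dataset row (spans are assumed here to be half-open `[start, end)` ranges, consistent with the example instance shown earlier):

```python
# Hypothetical record mirroring the `text` / `tokenizations` layout
# described above; each span delimits one sentence.
record = {
    "text": "First sentence here. Second sentence follows.",
    "tokenizations": [[0, 20], [21, 45]],
}

# Slice the question text into sentences using the character spans.
sentences = [record["text"][start:end] for start, end in record["tokenizations"]]
print(sentences)  # -> ['First sentence here.', 'Second sentence follows.']
```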
### Data Splits
| name |adversarial|buzzdev|buzztrain|guessdev|guesstrain|buzztest|guesstest|
|-----------------------|----------:|------:|--------:|-------:|---------:|-------:|--------:|
|mode=first,char_skip=25| 1145| 1161| 16706| 1055| 96221| 1953| 2151|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{Rodriguez2019QuizbowlTC,
title={Quizbowl: The Case for Incremental Question Answering},
author={Pedro Rodriguez and Shi Feng and Mohit Iyyer and He He and Jordan L. Boyd-Graber},
journal={ArXiv},
year={2019},
volume={abs/1904.04792}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@lewtun](https://github.com/lewtun) for adding this dataset. |
SaffalPoosh/HR-VITON | ---
dataset_info:
features:
- name: agnostic-v3.2
dtype: image
- name: cloth-mask
dtype: image
- name: image-densepose
dtype: image
- name: image-parse-v3
dtype: image
- name: openpose_json
dtype: string
- name: cloth
dtype: image
- name: image
dtype: image
- name: image-parse-agnostic-v3.2
dtype: image
- name: openpose_img
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 4512037155.566
num_examples: 11647
download_size: 4140730000
dataset_size: 4512037155.566
---
# Dataset Card for "HR-VITON"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/leona_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of leona/レオナ/莱昂纳/레오나 (Nikke: Goddess of Victory)
This is the dataset of leona/レオナ/莱昂纳/레오나 (Nikke: Goddess of Victory), containing 33 images and their tags.
The core tags of this character are `animal_ears, breasts, long_hair, pink_hair, large_breasts, bangs, yellow_eyes, animal_ear_fluff`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 33 | 53.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 33       | 25.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_nikke/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 82 | 57.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 33       | 45.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_nikke/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 82 | 89.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leona_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/leona_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
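The IMG+TXT packages listed above can be used without waifuc. A minimal sketch for pairing files after extracting one of them (say, `dataset-800.zip`), assuming the common layout in which each image sits next to a same-named `.txt` file of comma-separated tags:

```python
import os

def load_img_txt_pairs(dataset_dir):
    """Pair each image with its same-named .txt tag file.

    Assumes comma-separated tags, one .txt per image; adjust the
    extension list if the package contains other image formats.
    """
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            continue  # skip images without a tag file
        with open(txt_path, encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```

Each returned pair is `(image_path, tag_list)`, which is convenient as input to most LoRA training pipelines.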
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------|
| 0 | 33 |  |  |  |  |  | 1girl, blush, solo, looking_at_viewer, detached_sleeves, bare_shoulders, smile, open_mouth, sideboob, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | solo | looking_at_viewer | detached_sleeves | bare_shoulders | smile | open_mouth | sideboob | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:-------------------|:-----------------|:--------|:-------------|:-----------|:-------------------|
| 0 | 33 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
|
SiguienteGlobal/dpo-mix | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversation
list:
- name: input
dtype: string
- name: output
dtype: string
- name: original_response
dtype: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: new_generations
sequence: string
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rating_chosen
dtype: int64
- name: rating_rejected
dtype: int64
- name: chosen_model
dtype: string
- name: rejected_model
dtype: string
- name: turns
dtype: int64
- name: dataset
dtype: string
- name: chosen-rating
dtype: float64
- name: chosen-model
dtype: string
- name: rejected-rating
dtype: float64
- name: rejected-model
dtype: string
- name: system
dtype: string
- name: question
dtype: string
- name: generations
sequence: string
- name: order
sequence: string
- name: labelling_model
dtype: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
dtype: string
- name: rating
sequence: float64
- name: rationale
dtype: string
- name: status
dtype: string
- name: original_chosen
dtype: string
- name: original_rejected
dtype: string
- name: chosen_score
dtype: float64
- name: in_gsm8k_train
dtype: bool
splits:
- name: train
num_bytes: 6031995.32
num_examples: 270
download_size: 2688679
dataset_size: 6031995.32
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ResplendentAI/Alpaca_NSFW_Shuffled | ---
license: cc-by-nc-4.0
language:
- en
tags:
- not-for-all-audiences
pretty_name: Alpaca NSFW Shuffled
size_categories:
- n<1K
---
Reformatted and pruned this dataset: https://huggingface.co/datasets/athirdpath/DPO_Pairs-Roleplay-Alpaca-NSFW-v1-SHUFFLED |
AdapterOcean/med_alpaca_standardized_cluster_12_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 31270389
num_examples: 49396
download_size: 15769729
dataset_size: 31270389
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_12_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seenka/directv-zocalos-new-test-3fps | ---
dataset_info:
features:
- name: image
dtype: image
- name: timestamp
dtype: time64[us]
- name: video_storage_path
dtype: string
- name: zocalo_id
dtype: string
- name: frame_number
dtype: int64
splits:
- name: train
num_bytes: 4882626.0
num_examples: 15
download_size: 4795528
dataset_size: 4882626.0
---
# Dataset Card for "directv-zocalos-new-test-3fps"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
McSpicyWithMilo/target-elements-0.2split-new-move-validation | ---
dataset_info:
features:
- name: target_element
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 10459.2
num_examples: 80
- name: test
num_bytes: 1307.4
num_examples: 10
- name: valid
num_bytes: 1307.4
num_examples: 10
download_size: 12345
dataset_size: 13074.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
# Dataset Card for "target-elements-0.2split-new-move-validation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alex1qaz/goodsmemo | ---
license: openrail
---
|
jholst/reuters_articles | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 13792576
num_examples: 17262
- name: validation
num_bytes: 1870389
num_examples: 2158
- name: test
num_bytes: 1379190
num_examples: 2158
download_size: 10073414
dataset_size: 17042155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
olm/olm-CC-MAIN-2022-27-sampling-ratio-0.16142697881 | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license: []
multilinguality:
- monolingual
pretty_name: OLM June/July 2022 Common Crawl
size_categories:
- 10M<n<100M
source_datasets: []
tags:
- pretraining
- language modelling
- common crawl
- web
task_categories: []
task_ids: []
---
# Dataset Card for OLM June/July 2022 Common Crawl
Cleaned and deduplicated pretraining dataset, created with the OLM repo [here](https://github.com/huggingface/olm-datasets) from 16% of the June/July 2022 Common Crawl snapshot.
Note: `last_modified_timestamp` was parsed from whatever a website returned in its `Last-Modified` header; a small number of outliers are likely incorrect, so we recommend removing them before computing statistics with `last_modified_timestamp`. |
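A minimal sketch of the recommended outlier removal, assuming the timestamps are available as Unix epoch seconds (the field's actual dtype in the dataset may differ); the cutoff years are illustrative, not part of the dataset spec:

```python
from datetime import datetime, timezone

def drop_timestamp_outliers(timestamps, lo_year=1995, hi_year=2023):
    """Keep only timestamps within a plausible range for crawled web content.

    Values before `lo_year` (often bogus epoch-zero headers) or after
    `hi_year` (clock-skewed servers) are treated as outliers and dropped.
    """
    lo = datetime(lo_year, 1, 1, tzinfo=timezone.utc).timestamp()
    hi = datetime(hi_year, 1, 1, tzinfo=timezone.utc).timestamp()
    return [t for t in timestamps if lo <= t <= hi]
```

The same bounds can be applied as a `datasets.Dataset.filter` predicate before aggregating over `last_modified_timestamp`.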
jacobbieker/era5-42hour | ---
license: mit
---
|
nguyentruong-ins/nhlcoding_cleaned_cpp_dataset | ---
dataset_info:
features:
- name: solution
dtype: string
- name: difficulty
dtype: int64
splits:
- name: train
num_bytes: 1660958429.1502597
num_examples: 1386155
- name: test
num_bytes: 207620552.54922104
num_examples: 173270
- name: valid
num_bytes: 207619354.30051932
num_examples: 173269
download_size: 904311848
dataset_size: 2076198336.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_C_T_A_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__text
num_bytes: 1120441
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full__text
num_bytes: 1322886
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full__text
num_bytes: 1298326
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_wordnet_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full__text
num_bytes: 1326595
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_wordnet_blip_caption_5_Salesforce_blip_image_captioning_large_max_length_30_hf__text
num_bytes: 1702041
num_examples: 1000
- name: fewshot_0
num_bytes: 1175018
num_examples: 1000
download_size: 1053241
dataset_size: 7945307
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_C_T_A_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shauray/Shkreli-LoRA | ---
license: mit
---
|
suolyer/pile_enron | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_liminerity__phigment6-slerp | ---
pretty_name: Evaluation run of liminerity/phigment6-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liminerity/phigment6-slerp](https://huggingface.co/liminerity/phigment6-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__phigment6-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T12:59:01.551696](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__phigment6-slerp/blob/main/results_2024-02-29T12-59-01.551696.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5892303175337578,\n\
\ \"acc_stderr\": 0.033687856964891474,\n \"acc_norm\": 0.5902843276007427,\n\
\ \"acc_norm_stderr\": 0.034375351421247244,\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5048822092511128,\n\
\ \"mc2_stderr\": 0.01550679758683007\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946712,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759075\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.589523999203346,\n\
\ \"acc_stderr\": 0.004909148239488281,\n \"acc_norm\": 0.7724556861183032,\n\
\ \"acc_norm_stderr\": 0.00418390001445079\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.032685726586674915,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.032685726586674915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4523809523809524,\n \"acc_stderr\": 0.02563425811555495,\n \"\
acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.02563425811555495\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\
\ \"acc_stderr\": 0.026148685930671742,\n \"acc_norm\": 0.6967741935483871,\n\
\ \"acc_norm_stderr\": 0.026148685930671742\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117453,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683526,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683526\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.03128217706368461,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.03128217706368461\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936087,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936087\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373618,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373618\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.02917868230484255,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.02917868230484255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.0413311944024384,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.0413311944024384\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6922094508301405,\n\
\ \"acc_stderr\": 0.016506045045155633,\n \"acc_norm\": 0.6922094508301405,\n\
\ \"acc_norm_stderr\": 0.016506045045155633\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153193,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153193\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3016759776536313,\n\
\ \"acc_stderr\": 0.015350767572220286,\n \"acc_norm\": 0.3016759776536313,\n\
\ \"acc_norm_stderr\": 0.015350767572220286\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.02742047766262924,\n\
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.02742047766262924\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n\
\ \"acc_stderr\": 0.027466610213140105,\n \"acc_norm\": 0.6270096463022508,\n\
\ \"acc_norm_stderr\": 0.027466610213140105\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144366,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144366\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41199478487614083,\n\
\ \"acc_stderr\": 0.012570871032146073,\n \"acc_norm\": 0.41199478487614083,\n\
\ \"acc_norm_stderr\": 0.012570871032146073\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.030254372573976715,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.030254372573976715\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5637254901960784,\n \"acc_stderr\": 0.020062874243539128,\n \
\ \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.020062874243539128\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.028996909693328927,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.028996909693328927\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5048822092511128,\n\
\ \"mc2_stderr\": 0.01550679758683007\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7387529597474349,\n \"acc_stderr\": 0.012346914863415305\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5860500379075056,\n \
\ \"acc_stderr\": 0.01356699196015177\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/phigment6-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|arc:challenge|25_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|gsm8k|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hellaswag|10_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-59-01.551696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T12-59-01.551696.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- '**/details_harness|winogrande|5_2024-02-29T12-59-01.551696.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T12-59-01.551696.parquet'
- config_name: results
data_files:
- split: 2024_02_29T12_59_01.551696
path:
- results_2024-02-29T12-59-01.551696.parquet
- split: latest
path:
- results_2024-02-29T12-59-01.551696.parquet
---
# Dataset Card for Evaluation run of liminerity/phigment6-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/phigment6-slerp](https://huggingface.co/liminerity/phigment6-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__phigment6-slerp",
"harness_winogrande_5",
	split="latest")
```
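Once loaded, each per-task entry carries `acc`/`acc_norm` style metrics like those shown in the results block below. As a minimal, self-contained sketch (using two illustrative values copied from the results below, not a live download), averaging a metric across tasks looks like this:

```python
# Sketch: aggregate a metric across per-task result entries.
# The two values below are copied from the "Latest results" block of this card.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6262798634812287},
    "harness|hellaswag|10": {"acc_norm": 0.7724556861183032},
}

# Mean acc_norm over the selected tasks.
avg_acc_norm = sum(v["acc_norm"] for v in results.values()) / len(results)
print(round(avg_acc_norm, 4))
```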
## Latest results
These are the [latest results from run 2024-02-29T12:59:01.551696](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__phigment6-slerp/blob/main/results_2024-02-29T12-59-01.551696.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5892303175337578,
"acc_stderr": 0.033687856964891474,
"acc_norm": 0.5902843276007427,
"acc_norm_stderr": 0.034375351421247244,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.5048822092511128,
"mc2_stderr": 0.01550679758683007
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946712,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759075
},
"harness|hellaswag|10": {
"acc": 0.589523999203346,
"acc_stderr": 0.004909148239488281,
"acc_norm": 0.7724556861183032,
"acc_norm_stderr": 0.00418390001445079
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.032685726586674915,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.032685726586674915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.02563425811555495,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.02563425811555495
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.026148685930671742,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.026148685930671742
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117453,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683526,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683526
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.03128217706368461,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.03128217706368461
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936087,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936087
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373618,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373618
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.02917868230484255,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.02917868230484255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.0413311944024384,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.0413311944024384
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6922094508301405,
"acc_stderr": 0.016506045045155633,
"acc_norm": 0.6922094508301405,
"acc_norm_stderr": 0.016506045045155633
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153193,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153193
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3016759776536313,
"acc_stderr": 0.015350767572220286,
"acc_norm": 0.3016759776536313,
"acc_norm_stderr": 0.015350767572220286
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.02742047766262924,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.02742047766262924
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.027466610213140105,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.027466610213140105
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41199478487614083,
"acc_stderr": 0.012570871032146073,
"acc_norm": 0.41199478487614083,
"acc_norm_stderr": 0.012570871032146073
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.030254372573976715,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.030254372573976715
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.020062874243539128,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.020062874243539128
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328927,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328927
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.5048822092511128,
"mc2_stderr": 0.01550679758683007
},
"harness|winogrande|5": {
"acc": 0.7387529597474349,
"acc_stderr": 0.012346914863415305
},
"harness|gsm8k|5": {
"acc": 0.5860500379075056,
"acc_stderr": 0.01356699196015177
}
}
```
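The `acc`/`acc_stderr` pairs above can be turned into approximate 95% confidence intervals via the normal approximation. A minimal sketch in plain Python, using the GSM8K entry from the results above:

```python
# Approximate 95% CI from an accuracy and its standard error,
# using the normal approximation (acc +/- 1.96 * stderr).
acc = 0.5860500379075056      # harness|gsm8k|5 "acc"
stderr = 0.01356699196015177  # harness|gsm8k|5 "acc_stderr"

ci_low = acc - 1.96 * stderr
ci_high = acc + 1.96 * stderr
print(f"GSM8K 5-shot accuracy: {acc:.3f} (95% CI {ci_low:.3f}-{ci_high:.3f})")
```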
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xl_mode_A_D_PNP_GENERIC_C_Q_rices_ns_200 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__
num_bytes: 28609
num_examples: 200
download_size: 14030
dataset_size: 28609
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xl_mode_A_D_PNP_GENERIC_C_Q_rices_ns_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
akoksal/LongForm | ---
license: mit
task_categories:
- table-question-answering
- summarization
- text2text-generation
- text-generation
- question-answering
language:
- en
pretty_name: longform
paperswithcode_id: longform
size_categories:
- 10K<n<100K
---
# LongForm
The LongForm dataset is created by leveraging English corpus
examples with reverse instructions. We select a
diverse set of human-written
documents from existing corpora such as C4 and
Wikipedia and generate instructions for the given
documents via LLMs. Then, we extend these examples with structured corpora examples such as Stack Exchange and WikiHow and task examples such as question answering, email writing, grammar error correction, story/poem generation, and text summarization.

## Distribution
The distribution of the LongForm dataset in terms of the source of examples is below. It contains examples generated from raw text corpora via LLMs, structured corpus examples, as well as various NLP task examples such as email writing, grammar error correction, story/poem generation, and text summarization.
| **Type** | **Source** | **Number of Examples** |
|------------------------|----------------|------------------------|
| **Corpora** | C4 | 10,000 |
| | Wikipedia | 5,000 |
| **Structured Corpora** | Stack Exchange | 4,380 |
| | WikiHow | 2,500 |
| **Tasks** | NIv2 | 3,684 |
| | Big Bench | 600 |
| | BEA-GEC | 1,203 |
| | Enron | 372 |
| **Total** | | 27,739 |
| | | |
| **Train** | | 23,652 |
| **Validation** | | 2,042 |
| **Test** | | 2,045 |
## Models
| | **All** | **Recipe Generation** | **ELI5** | **Writing Prompts** |
|-----------------------|---------|-----------------------------------|----------|---------------------|
| **T0++** | 10.9 | 18.7 | 3.8 | 10.2 |
| **Tk-Instruct** | 6.3 | 12.9* | 3.6 | 2.4 |
| **Flan-T5** | 10.6 | 20.9* | 3.5 | 7.4 |
| **Alpaca-LLaMA-7B** | 14.6 | 19.5 | 12.5 | 11.8 |
| **OPT-30B** | 11.1 | 18.6 | 12.2 | 2.6 |
| **[LongForm-T5-XL](https://huggingface.co/akoksal/LongForm-T5-XL)** | 16.3 | 20.2 | 18.3 | 10.6 |
| **[LongForm-OPT-6.7B](https://huggingface.co/akoksal/LongForm-OPT-6.7B)** | 17.7 | 16.9 | 17.2 | 19.0 |
| [**LongForm-LLaMA-7B**](https://huggingface.co/akoksal/LongForm-LLaMA-7B-diff)‡ | **19.7** | **21.7** | **18.6** | 18.9 |
‡: Due to the license restrictions of the LLaMA models, we can only publicly release the difference between LongForm-LLaMA-7B and the pretrained LLaMA-7B.
## Examples
| **Source** | **Instruction** | **Output**|
|------------------------|----------------|------------------------|
| C4 | What is the likelihood that Donald Trump's HHS secretary nominee, Alex Azar, will push policies that benefit pharmaceutical companies' pocketbooks? | His record suggests he’s unlikely to push pricing policies that would hit pharmaceutical companies’ pocketbooks.<br />President Donald Trump tweeted Monday that his nominee for HHS secretary, Alex Azar will “be a star for … lower drug prices!” But the record of the former top executive for Eli Lilly, which tripled the price of a top-selling insulin drug while he led its U.S. operation, suggests a different story.<br />Lilly is one of three drug companies targeted by a class-action lawsuit that accuses the company[...]<br />“This nominee has impeccable big pharma credentials but non-existent credentials when it comes to concerns on drug prices,” said Peter Welch, co-chair of the House Democrats Drug Pricing Task Force.
|Wikipedia | "Where is San Saba located in Rome, Italy?" Respond in 2 sentences. |San Saba is an ancient basilica church in Rome, Italy. It lies on the so-called "Piccolo Aventino", which is an area close to the ancient Aurelian Walls next to the Aventine Hill and Caelian Hill.|
## Dataset Description
- **Repository:** https://github.com/akoksal/LongForm
- **Paper:** https://arxiv.org/abs/2304.08460
- **Version:** v1.0 - April 18, 2023
- **Contact:** [Abdullatif Köksal](https://twitter.com/akoksal_)
## License
The LongForm project is subject to an MIT License with custom limitations reflecting restrictions imposed by OpenAI (for the instruction generation part), as well as the licenses of the underlying language models (OPT, LLaMA, and T5). The WikiHow subset of LongForm-C is subject to the license proposed by WikiHow.
## Citation
```
@misc{koksal2023longform,
title={LongForm: Effective Instruction Tuning with Reverse Instructions},
author={Abdullatif Köksal and Timo Schick and Anna Korhonen and Hinrich Schütze},
year={2023},
eprint={2304.08460},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
mertbozkurt/turkish-recipe | ---
license: mit
task_categories:
- question-answering
- conversational
- text-generation
language:
- tr
size_categories:
- 1K<n<10K
---
# Dataset Summary
The dataset contains Turkish food recipes. Each record includes the title, URL, category, required ingredients, and preparation instructions.
# Languages
The dataset is in Turkish.
# Data Instances for datav2.csv
* Title : Tavuklu Zade Kebabı,
* Link: https://ye-mek.net/tarif/tavuklu-zade-kebabi,
* Category: Ana-Yemek,
* Materials: "['4 adet orta boy kemer patlıcan', '500 gr kuşbaşı doğranmış tavuk göğsü', '2 adet orta boy patates', '1 adet orta boy soğan', '2 adet yeşil biber', '1 adet orta boy domates', '2 diş sarımsak', '1 tatlı kaşığıdomates salçası', '5 yemek kaşığı zeytinyağı', 'Tuz', 'Karabiber', 'Üzeri İçin:', 'Rendelenmiş kaşar peynir']",
* How to do: "Tavuklu zade kebabı yapımı için; geniş bir tencere içine 4-5 yemek kaşığı zeytinyağı döküp, ısıtın. Isınan yağın üzerine 500 gr kuşbaşı doğranmış tavuk etini koyun. Suyunu salıp, hafifçe çekene kadar pişirin.Daha sonra tavuk etlerinin üzerine 1 adet orta boy ince ince doğranmış soğan ve 2 adet küçük küçük doğranmış yeşil biberi ekleyin. 2-3 dakika ara ara karıştırarak, pişirmeye devam edin. Ardından tencereye 1 tatlı kaşığı domates salçası ve 1 adet orta boy ince ince doğranmış domates koyup, 1-2 dakika güzelce kavurun. Son olarak tavuklu harcın üzerine damak tadınıza göre tuz ve karabiber koyup, karıştırın. Tencerenin kapağını kapatıp, kısık ateş üzerinde domatesler yumuşayana kadar pişirin.Diğer tarafta 2 adet orta boy patatesin kabuğunu soyup, çok küçük olmayacak şekilde küpler halinde doğrayın. Doğradığınız patatesleri kızgın yağ içinde güzelce kızartın. Daha sonra patateslerin yağını iyice süzüp, hazırladığınız tavuklu harcın üzerine koyun. Tüm harcı güzelce karıştırıp, kenara alın.Daha sonra 4 adet orta büyüklükteki kemer patlıcanı alacalı olarak soyup, sap kısımlarını kesin. Bıçak yardımı ile uzunlamasına çok kalın ve ince olmayacak şekilde dilimleyin. Dilimlediğiniz patlıcanları kızgın yağ içinde arkalı önlü kızartın. Kızaran patlıcanları kağıt havlu üzerine alıp, yağlarının süzülmesini sağlayın.Diğer tarafta kızarttığınız patlıcanlardan 6 dilimini alıp, yarısı dışarıda kalacak şekilde orta boy bir kase içine biraz aralıklı olacak şekilde dizin. Patlıcanların orta kısmına tavuklu patates harcından koyun. Dışarı sarkan patlıcanları harcın üzerine güzelce kapatın. Ardından kaseyi diğer eliniz ile tutarak dikkatli bir şekilde ters çevirin. Kaseden çıkan tavuklu zade kebabını bir fırın kabı içine koyun. Üzerlerine rendelenmiş kaşar peynir serpiştirin. Önceden ısıtılmış 190 derece fırına verin. Üzeri hafifçe kızarana kadar yaklaşık 15 dakika pişirin.Tavuklu zade kebabı piştikten sonra fırından çıkartıp, sıcak olarak servis edebilirsiniz."
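Note that the `Materials` field in `datav2.csv` is stored as the string form of a Python list. A minimal sketch (the literal below is shortened from the example above) for safely parsing it back into a list with the standard library:

```python
import ast

# The Materials column stores a Python-list literal as a string.
materials_raw = "['4 adet orta boy kemer patlıcan', '500 gr kuşbaşı doğranmış tavuk göğsü', 'Tuz']"
materials = ast.literal_eval(materials_raw)  # safe parsing of the literal

print(len(materials))  # → 3
print(materials[0])    # → 4 adet orta boy kemer patlıcan
```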
# Data Instances for datav3.txt
Sodalı Köfte nasıl yapılır?
Sodalı Köfte için gerekli malzemeler:
500 gr kıyma
1 adet büyük boy kuru soğan
1/2 çay bardağıgaleta unu
1 tatlı kaşığı tuz
1 çay kaşığı dolusu kırmızı toz biber
1 çay kaşığı kırmızı pul biber
1 çay kaşığı kimyon
1/2 çay kaşığı karabiber
1/2 paket kabartma tozu
1 çay bardağı soda
Sodalı Köfte Yapılışı:
Sodalı köfte yapımı için derin bir kap içine 1 adet büyük boy soğan rendeleyin.
Rendelediğiniz soğanın suyu varsa suyunu süzün.
Ardından üzerine yarım kilo kıyma koyun.
Daha sonra kaba yarım çay bardağı galeta unu, 1 tatlı kaşığı tuz, 1 çay kaşığı dolusu kırmızı toz biber, 1 çay kaşığı kırmızı pul biber, 1 çay kaşığı kimyon, yarım çay kaşığı karabiber ve yarım paket kabartma tozu koyun.
Son olarak köfteli harcın üzerine 1 çay bardağı soda dökün.
Tüm köfte harcını eliniz ile iyice yoğurun.
Hazırladığınız sodalı köfte harcını buzdolabından en az 1 saat dinlenmeye bırakın.Daha sonra dinlenen köfte harcından ceviz büyüklüğünde parçalar alıp, elinizde yuvarlak ya da oval şeklini verin.
Şekil verdiğiniz köfteleri yağlı kağıt serili fırın tepsisi içine dizin.
Köftelerin yanına isteğe göre birkaç domates ve biber koyabilirsiniz.Sodalı köfteleri önceden ısıtılmış 200 derece fırına verin.
Üzerleri güzelce kızarana kadar pişirin.Fırında sodalı köfteleriniz piştikten sonra sıcak olarak servis edebilirsiniz.
# Collection Methodology
The recipes were collected with Python web scraping using BeautifulSoup.
# Source
The source of the recipes is https://ye-mek.net |
101arrowz/vox_celeb | ---
annotations_creators:
- crowdsourced
language: []
language_creators:
- crowdsourced
license:
- cc-by-4.0
multilinguality:
- multilingual
pretty_name: VoxCeleb
size_categories:
- 1K<n<10K
- 10K<n<100K
- 100K<n<1M
source_datasets: []
tags: []
task_categories:
- automatic-speech-recognition
- audio-classification
- image-classification
task_ids:
- speaker-identification
---
# Dataset Card for VoxCeleb
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
### Dataset Summary
VoxCeleb is an audio-visual dataset consisting of short clips of human speech, extracted from interview videos uploaded to YouTube.
NOTE: Although this dataset can be automatically downloaded, you must manually request credentials to access it from the creators' website.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
Each datapoint has a path to the audio/video clip along with metadata about the speaker.
```
{
'file': '/datasets/downloads/extracted/[hash]/wav/id10271/_YimahVgI1A/00003.wav',
'file_format': 'wav',
'dataset_id': 'vox1',
'speaker_id': 'id10271',
'speaker_gender': 'm',
'speaker_name': 'Ed_Westwick',
'speaker_nationality': 'UK',
'video_id': '_YimahVgI1A',
'clip_id': '00003',
'audio': {
'path': '/datasets/downloads/extracted/[hash]/wav/id10271/_YimahVgI1A/00003.wav',
'array': array([...], dtype=float32),
'sampling_rate': 16000
}
}
```
### Data Fields
Each row includes the following fields:
- `file`: The path to the audio/video clip
- `file_format`: The file format in which the clip is stored (e.g. `wav`, `aac`, `mp4`)
- `dataset_id`: The ID of the dataset this clip is from (`vox1`, `vox2`)
- `speaker_id`: The ID of the speaker in this clip
- `speaker_gender`: The gender of the speaker (`m`/`f`)
- `speaker_name` (VoxCeleb1 only): The full name of the speaker in the clip
- `speaker_nationality` (VoxCeleb1 only): The speaker's country of origin
- `video_id`: The ID of the video from which this clip was taken
- `clip_id`: The index of the clip within this specific video
- `audio` (Audio dataset only): The audio signal data
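The path layout shown in the example instance encodes the speaker, video, and clip identifiers (`wav/<speaker_id>/<video_id>/<clip_id>.wav`). A minimal sketch for recovering them from a `file` value, assuming that layout (the `abc123` hash below is an arbitrary placeholder):

```python
from pathlib import Path

file = "/datasets/downloads/extracted/abc123/wav/id10271/_YimahVgI1A/00003.wav"
p = Path(file)

clip_id = p.stem                   # "00003"
video_id = p.parent.name           # "_YimahVgI1A"
speaker_id = p.parent.parent.name  # "id10271"
print(speaker_id, video_id, clip_id)
```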
### Data Splits
The dataset has a predefined dev set and test set. The dev set has been renamed to a "train" split.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
The dataset includes recordings of clips (mostly of celebrities and public figures) from public YouTube videos. The names of speakers in VoxCeleb1 are provided.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
The VoxCeleb authors request that anyone who uses VoxCeleb1 or VoxCeleb2 includes the following three citations:
```
@Article{Nagrani19,
author = "Arsha Nagrani and Joon~Son Chung and Weidi Xie and Andrew Zisserman",
title = "Voxceleb: Large-scale speaker verification in the wild",
journal = "Computer Science and Language",
year = "2019",
publisher = "Elsevier",
}
@InProceedings{Chung18b,
author = "Chung, J.~S. and Nagrani, A. and Zisserman, A.",
title = "VoxCeleb2: Deep Speaker Recognition",
booktitle = "INTERSPEECH",
year = "2018",
}
@InProceedings{Nagrani17,
author = "Nagrani, A. and Chung, J.~S. and Zisserman, A.",
title = "VoxCeleb: a large-scale speaker identification dataset",
booktitle = "INTERSPEECH",
year = "2017",
}
```
### Contributions
Thanks to [@101arrowz](https://github.com/101arrowz) for adding this dataset.
|
dknoar01/dknoar | ---
license: openrail
---
|
taesiri/fsmbench_transition_check | ---
dataset_info:
features:
- name: query_id
dtype: string
- name: fsm_id
dtype: string
- name: fsm_json
dtype: string
- name: difficulty_level
dtype: int64
- name: transition_matrix
dtype: string
- name: query
dtype: string
- name: answer
dtype: bool
splits:
- name: validation
num_bytes: 348667344
num_examples: 100000
download_size: 73222900
dataset_size: 348667344
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
AlekseyKorshuk/ultrachat_200k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train_sft
num_bytes: 1391615034
num_examples: 207646
download_size: 730773530
dataset_size: 1391615034
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
---
|
thomasavare/waste-classification-audio-deepl2 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: speaker
dtype: string
- name: transcription
dtype: string
- name: translation
dtype: string
- name: Class
dtype: string
- name: Class_index
dtype: float64
splits:
- name: train
num_bytes: 289237985.0
num_examples: 500
download_size: 289226215
dataset_size: 289237985.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Audio created from the "italian-dataset-deepl2" dataset.
ikanher/dtd | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': banded
'1': blotchy
'2': braided
'3': bubbly
'4': bumpy
'5': chequered
'6': cobwebbed
'7': cracked
'8': crosshatched
'9': crystalline
'10': dotted
'11': fibrous
'12': flecked
'13': freckled
'14': frilly
'15': gauzy
'16': grid
'17': grooved
'18': honeycombed
'19': interlaced
'20': knitted
'21': lacelike
'22': lined
'23': marbled
'24': matted
'25': meshed
'26': paisley
'27': perforated
'28': pitted
'29': pleated
'30': polka-dotted
'31': porous
'32': potholed
'33': scaly
'34': smeared
'35': spiralled
'36': sprinkled
'37': stained
'38': stratified
'39': striped
'40': studded
'41': swirly
'42': veined
'43': waffled
'44': woven
'45': wrinkled
'46': zigzagged
splits:
- name: train
num_bytes: 717407652.0
num_examples: 1880
- name: test
num_bytes: 684789229.0
num_examples: 1880
- name: validation
num_bytes: 720930661.0
num_examples: 1880
download_size: 2123252036
dataset_size: 2123127542.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
anan-2024/twitter_dataset_1712991334 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 262833
num_examples: 716
download_size: 138606
dataset_size: 262833
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
psyche/instruction-gpt-3.5 | ---
dataset_info:
features:
- name: question
dtype: string
- name: gpt-3.5-turbo
dtype: string
splits:
- name: train
num_bytes: 6449418
num_examples: 5884
download_size: 3445248
dataset_size: 6449418
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "instruction-gpt-3.5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WMGX/ai-tube-dailydoseofmemes | ---
license: cc-by-nc-sa-4.0
pretty_name: "Your Daily Dose of Memes"
---
## Description
I post memes every day, for YOUR entertainment!
## Model
SVD
## LoRA
veryVANYA/ps1-graphics-sdxl-v2
## Tags
- Memes
- Gaming
## Voice
Cloée
## Music
Upbeat video game music.
## Prompt
You will attempt to generate memes, such as cats doing silly things, funny deaths in video games, and anything that can be considered "funny, cute, adorable, hilarious," or any similar keywords. |
TrainingDataPro/parking-space-detection-dataset | ---
language:
- en
license: cc-by-nc-nd-4.0
task_categories:
- image-classification
- image-to-image
tags:
- code
dataset_info:
features:
- name: id
dtype: int32
- name: image
dtype: image
- name: mask
dtype: image
- name: bboxes
dtype: string
splits:
- name: train
num_bytes: 44610347
num_examples: 30
download_size: 44532683
dataset_size: 44610347
---
# Parking Space Detection & Classification Dataset
The dataset consists of images of parking spaces along with corresponding bounding box masks. In order to facilitate object detection and localization, every parking space in the images is annotated with a bounding box mask.
The bounding box mask outlines the boundary of the parking space, marking its position and shape within the image. This allows for accurate identification and extraction of individual parking spaces. Each parking spot is also labeled according to its occupancy: **free, not free, or partially free**.
This dataset can be leveraged for a range of applications such as *parking lot management, autonomous vehicle navigation, smart city implementations, and traffic analysis*.

# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/parking-spaces-detection?utm_source=huggingface&utm_medium=cpc&utm_campaign=parking-space-detection-dataset) to discuss your requirements, learn about the price and buy the dataset.
# Dataset structure
- **images** - contains of original images of parkings
- **boxes** - includes bounding box labeling for the original images
- **annotations.xml** - contains coordinates of the bounding boxes and labels, created for the original photo
# Data Format
Each image from `images` folder is accompanied by an XML-annotation in the `annotations.xml` file indicating the coordinates of the bounding boxes and labels for parking spaces. For each point, the x and y coordinates are provided.
### Labels for the parking space:
- **free_parking_space** - corresponds to free parking spaces, the box is **blue**
- **not_free_parking_space** - corresponds to occupied parking spaces, the box is **red**
- **partially_free_parking_space** - corresponds to partially free parking spaces, the box is **yellow**
# Example of XML file structure

# Parking Space Detection & Classification might be made in accordance with your requirements.
## [**TrainingData**](https://trainingdata.pro/data-market/parking-spaces-detection?utm_source=huggingface&utm_medium=cpc&utm_campaign=parking-space-detection-dataset) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-preference-64-nsample-2_iso_filter_gold_thr_0.0_self_160m | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_12
num_bytes: 44183647
num_examples: 18929
- name: epoch_13
num_bytes: 44183352
num_examples: 18929
- name: epoch_14
num_bytes: 44182144
num_examples: 18929
- name: epoch_15
num_bytes: 44180910
num_examples: 18929
- name: epoch_16
num_bytes: 44180025
num_examples: 18929
- name: epoch_17
num_bytes: 44177405
num_examples: 18929
- name: epoch_18
num_bytes: 44178703
num_examples: 18929
- name: epoch_19
num_bytes: 44177940
num_examples: 18929
- name: epoch_20
num_bytes: 44177032
num_examples: 18929
- name: epoch_21
num_bytes: 44177421
num_examples: 18929
- name: epoch_22
num_bytes: 44176750
num_examples: 18929
- name: epoch_23
num_bytes: 44176502
num_examples: 18929
- name: epoch_24
num_bytes: 44175909
num_examples: 18929
- name: epoch_25
num_bytes: 44173244
num_examples: 18929
- name: epoch_26
num_bytes: 44173892
num_examples: 18929
- name: epoch_27
num_bytes: 44174193
num_examples: 18929
- name: epoch_28
num_bytes: 44174272
num_examples: 18929
- name: epoch_29
num_bytes: 44173395
num_examples: 18929
- name: epoch_0
num_bytes: 43548965
num_examples: 18929
- name: epoch_1
num_bytes: 44052631
num_examples: 18929
- name: epoch_2
num_bytes: 44070857
num_examples: 18929
- name: epoch_3
num_bytes: 44099166
num_examples: 18929
- name: epoch_4
num_bytes: 44114016
num_examples: 18929
- name: epoch_5
num_bytes: 44124275
num_examples: 18929
- name: epoch_6
num_bytes: 44133117
num_examples: 18929
- name: epoch_7
num_bytes: 44139483
num_examples: 18929
- name: epoch_8
num_bytes: 44137949
num_examples: 18929
- name: epoch_9
num_bytes: 44138814
num_examples: 18929
- name: epoch_10
num_bytes: 44136489
num_examples: 18929
- name: epoch_11
num_bytes: 44134458
num_examples: 18929
download_size: 4023079354
dataset_size: 1324026956
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
HasturOfficial/adgen | ---
dataset_info:
features:
- name: content
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 51127446
num_examples: 114599
- name: validation
num_bytes: 473784
num_examples: 1070
download_size: 27853861
dataset_size: 51601230
---
# Dataset Card for "adgen"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thangvip/orca-processes | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 31950704.438731268
num_examples: 32860
download_size: 11256640
dataset_size: 31950704.438731268
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "orca-processes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eson/cc100-samples | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- af
- am
- ar
- as
- az
- be
- bg
- bn
- br
- bs
- ca
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- ff
- fi
- fr
- fy
- ga
- gd
- gl
- gn
- gu
- ha
- he
- hi
- hr
- ht
- hu
- hy
- id
- ig
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lg
- li
- ln
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- 'no'
- ns
- om
- or
- pa
- pl
- ps
- pt
- qu
- rm
- ro
- ru
- sa
- sc
- sd
- si
- sk
- sl
- so
- sq
- sr
- ss
- su
- sv
- sw
- ta
- te
- th
- tl
- tn
- tr
- ug
- uk
- ur
- uz
- vi
- wo
- xh
- yi
- yo
- zh
- zu
language_bcp47:
- bn-Latn
- hi-Latn
- my-x-zawgyi
- ta-Latn
- te-Latn
- ur-Latn
- zh-Hans
- zh-Hant
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: cc100
pretty_name: CC100
configs:
- config_name: am
data_files:
- split: train
path: data/am.txt
- config_name: ar
data_files:
- split: train
path: data/ar.txt
- config_name: as
data_files:
- split: train
path: data/as.txt
- config_name: az
data_files:
- split: train
path: data/az.txt
- config_name: be
data_files:
- split: train
path: data/be.txt
- config_name: bg
data_files:
- split: train
path: data/bg.txt
- config_name: bn
data_files:
- split: train
path: data/bn.txt
- config_name: bn_rom
data_files:
- split: train
path: data/bn_rom.txt
- config_name: br
data_files:
- split: train
path: data/br.txt
- config_name: bs
data_files:
- split: train
path: data/bs.txt
- config_name: ca
data_files:
- split: train
path: data/ca.txt
- config_name: cs
data_files:
- split: train
path: data/cs.txt
- config_name: cy
data_files:
- split: train
path: data/cy.txt
- config_name: da
data_files:
- split: train
path: data/da.txt
- config_name: de
data_files:
- split: train
path: data/de.txt
- config_name: el
data_files:
- split: train
path: data/el.txt
- config_name: en
data_files:
- split: train
path: data/en.txt
- config_name: eo
data_files:
- split: train
path: data/eo.txt
- config_name: es
data_files:
- split: train
path: data/es.txt
- config_name: et
data_files:
- split: train
path: data/et.txt
- config_name: eu
data_files:
- split: train
path: data/eu.txt
- config_name: fa
data_files:
- split: train
path: data/fa.txt
- config_name: ff
data_files:
- split: train
path: data/ff.txt
- config_name: fi
data_files:
- split: train
path: data/fi.txt
- config_name: fr
data_files:
- split: train
path: data/fr.txt
- config_name: fy
data_files:
- split: train
path: data/fy.txt
- config_name: ga
data_files:
- split: train
path: data/ga.txt
- config_name: gd
data_files:
- split: train
path: data/gd.txt
- config_name: gl
data_files:
- split: train
path: data/gl.txt
- config_name: gn
data_files:
- split: train
path: data/gn.txt
- config_name: gu
data_files:
- split: train
path: data/gu.txt
- config_name: ha
data_files:
- split: train
path: data/ha.txt
- config_name: he
data_files:
- split: train
path: data/he.txt
- config_name: hi
data_files:
- split: train
path: data/hi.txt
- config_name: hi_rom
data_files:
- split: train
path: data/hi_rom.txt
- config_name: hr
data_files:
- split: train
path: data/hr.txt
- config_name: ht
data_files:
- split: train
path: data/ht.txt
- config_name: hu
data_files:
- split: train
path: data/hu.txt
- config_name: hy
data_files:
- split: train
path: data/hy.txt
- config_name: id
data_files:
- split: train
path: data/id.txt
- config_name: ig
data_files:
- split: train
path: data/ig.txt
- config_name: is
data_files:
- split: train
path: data/is.txt
- config_name: it
data_files:
- split: train
path: data/it.txt
- config_name: ja
data_files:
- split: train
path: data/ja.txt
- config_name: jv
data_files:
- split: train
path: data/jv.txt
- config_name: ka
data_files:
- split: train
path: data/ka.txt
- config_name: kk
data_files:
- split: train
path: data/kk.txt
- config_name: km
data_files:
- split: train
path: data/km.txt
- config_name: kn
data_files:
- split: train
path: data/kn.txt
- config_name: ko
data_files:
- split: train
path: data/ko.txt
- config_name: ku
data_files:
- split: train
path: data/ku.txt
- config_name: ky
data_files:
- split: train
path: data/ky.txt
- config_name: la
data_files:
- split: train
path: data/la.txt
- config_name: lg
data_files:
- split: train
path: data/lg.txt
- config_name: li
data_files:
- split: train
path: data/li.txt
- config_name: ln
data_files:
- split: train
path: data/ln.txt
- config_name: lo
data_files:
- split: train
path: data/lo.txt
- config_name: lt
data_files:
- split: train
path: data/lt.txt
- config_name: lv
data_files:
- split: train
path: data/lv.txt
- config_name: mg
data_files:
- split: train
path: data/mg.txt
- config_name: mk
data_files:
- split: train
path: data/mk.txt
- config_name: ml
data_files:
- split: train
path: data/ml.txt
- config_name: mn
data_files:
- split: train
path: data/mn.txt
- config_name: mr
data_files:
- split: train
path: data/mr.txt
- config_name: ms
data_files:
- split: train
path: data/ms.txt
- config_name: my
data_files:
- split: train
path: data/my.txt
- config_name: my_zaw
data_files:
- split: train
path: data/my_zaw.txt
- config_name: ne
data_files:
- split: train
path: data/ne.txt
- config_name: nl
data_files:
- split: train
path: data/nl.txt
- config_name: 'no'
data_files:
- split: train
path: data/no.txt
- config_name: ns
data_files:
- split: train
path: data/ns.txt
- config_name: om
data_files:
- split: train
path: data/om.txt
- config_name: or
data_files:
- split: train
path: data/or.txt
- config_name: pa
data_files:
- split: train
path: data/pa.txt
- config_name: pl
data_files:
- split: train
path: data/pl.txt
- config_name: ps
data_files:
- split: train
path: data/ps.txt
- config_name: pt
data_files:
- split: train
path: data/pt.txt
- config_name: qu
data_files:
- split: train
path: data/qu.txt
- config_name: rm
data_files:
- split: train
path: data/rm.txt
- config_name: ro
data_files:
- split: train
path: data/ro.txt
- config_name: ru
data_files:
- split: train
path: data/ru.txt
- config_name: sa
data_files:
- split: train
path: data/sa.txt
- config_name: si
data_files:
- split: train
path: data/si.txt
- config_name: sc
data_files:
- split: train
path: data/sc.txt
- config_name: sd
data_files:
- split: train
path: data/sd.txt
- config_name: sk
data_files:
- split: train
path: data/sk.txt
- config_name: sl
data_files:
- split: train
path: data/sl.txt
- config_name: so
data_files:
- split: train
path: data/so.txt
- config_name: sq
data_files:
- split: train
path: data/sq.txt
- config_name: sr
data_files:
- split: train
path: data/sr.txt
- config_name: ss
data_files:
- split: train
path: data/ss.txt
- config_name: su
data_files:
- split: train
path: data/su.txt
- config_name: sv
data_files:
- split: train
path: data/sv.txt
- config_name: sw
data_files:
- split: train
path: data/sw.txt
- config_name: ta
data_files:
- split: train
path: data/ta.txt
- config_name: ta_rom
data_files:
- split: train
path: data/ta_rom.txt
- config_name: te
data_files:
- split: train
path: data/te.txt
- config_name: te_rom
data_files:
- split: train
path: data/te_rom.txt
- config_name: th
data_files:
- split: train
path: data/th.txt
- config_name: tl
data_files:
- split: train
path: data/tl.txt
- config_name: tn
data_files:
- split: train
path: data/tn.txt
- config_name: tr
data_files:
- split: train
path: data/tr.txt
- config_name: ug
data_files:
- split: train
path: data/ug.txt
- config_name: uk
data_files:
- split: train
path: data/uk.txt
- config_name: ur
data_files:
- split: train
path: data/ur.txt
- config_name: ur_rom
data_files:
- split: train
path: data/ur_rom.txt
- config_name: uz
data_files:
- split: train
path: data/uz.txt
- config_name: vi
data_files:
- split: train
path: data/vi.txt
- config_name: wo
data_files:
- split: train
path: data/wo.txt
- config_name: xh
data_files:
- split: train
path: data/xh.txt
- config_name: yi
data_files:
- split: train
path: data/yi.txt
- config_name: yo
data_files:
- split: train
path: data/yo.txt
- config_name: zh-Hans
data_files:
- split: train
path: data/zh-Hans.txt
- config_name: zh-Hant
data_files:
- split: train
path: data/zh-Hant.txt
- config_name: zu
data_files:
- split: train
path: data/zu.txt
---
cc100-samples is a subset containing the first 10,000 lines of [cc100](https://data.statmt.org/cc-100/).
### Languages
To load a language, specify its language code as the config name.
You can find the valid languages in the Homepage section of the Dataset Description: https://data.statmt.org/cc-100/
E.g.
`dataset = load_dataset("eson/cc100-samples", "en")`
```py
VALID_CODES = [
"am", "ar", "as", "az", "be", "bg", "bn", "bn_rom", "br", "bs", "ca", "cs", "cy", "da", "de",
"el", "en", "eo", "es", "et", "eu", "fa", "ff", "fi", "fr", "fy", "ga", "gd", "gl", "gn", "gu",
"ha", "he", "hi", "hi_rom", "hr", "ht", "hu", "hy", "id", "ig", "is", "it", "ja", "jv", "ka",
"kk", "km", "kn", "ko", "ku", "ky", "la", "lg", "li", "ln", "lo", "lt", "lv", "mg", "mk", "ml",
"mn", "mr", "ms", "my", "my_zaw", "ne", "nl", "no", "ns", "om", "or", "pa", "pl", "ps", "pt",
"qu", "rm", "ro", "ru", "sa", "si", "sc", "sd", "sk", "sl", "so", "sq", "sr", "ss", "su", "sv",
"sw", "ta", "ta_rom", "te", "te_rom", "th", "tl", "tn", "tr", "ug", "uk", "ur", "ur_rom", "uz",
"vi", "wo", "xh", "yi", "yo", "zh-Hans", "zh-Hant", "zu",
]
```
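Given the configs listed above, each language code maps directly to a text file under `data/`. A minimal helper (hypothetical, for illustration only) can validate a code against the list and build the corresponding repo-relative path:

```python
# Abbreviated for illustration; the full list of valid codes is shown above.
VALID_CODES = ["en", "fr", "ja", "ur_rom", "zh-Hans", "zh-Hant"]


def data_file_for(code: str) -> str:
    """Return the repo-relative data file for a language config, e.g. 'data/en.txt'."""
    if code not in VALID_CODES:
        raise ValueError(f"unknown language code: {code!r}")
    return f"data/{code}.txt"
```

For example, `data_file_for("zh-Hans")` yields `data/zh-Hans.txt`, matching the `zh-Hans` config entry above.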
## Dataset Structure
### Data Instances
An example from the `am` configuration:
```
{'id': '0', 'text': 'ተለዋዋጭ የግድግዳ አንግል ሙቅ አንቀሳቅሷል ቲ-አሞሌ አጥቅሼ ...\n'}
```
Each data point is a paragraph of text. The paragraphs are presented in the original (unshuffled) order. Documents are separated by a data point consisting of a single newline character.
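Because document boundaries are marked by a data point whose text is a single newline, regrouping the paragraph stream into documents is straightforward. A sketch (here `texts` stands for the sequence of `text` values from a split):

```python
def group_documents(texts):
    """Group paragraph strings into documents; a bare newline marks a boundary."""
    docs, current = [], []
    for t in texts:
        if t == "\n":  # separator data point: close the current document
            if current:
                docs.append(current)
            current = []
        else:
            current.append(t)
    if current:  # flush the trailing document, if any
        docs.append(current)
    return docs
```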
### Data Fields
The data fields are:
- id: id of the example
- text: content as a string
|
sorbhet/adkrity | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-L2-13B | ---
pretty_name: Evaluation run of Doctor-Shotgun/CalliopeDS-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Doctor-Shotgun/CalliopeDS-L2-13B](https://huggingface.co/Doctor-Shotgun/CalliopeDS-L2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-L2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T04:36:21.549191](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-L2-13B/blob/main/results_2023-10-26T04-36-21.549191.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.02307046979865772,\n\
\ \"em_stderr\": 0.0015374446489046481,\n \"f1\": 0.08979446308724821,\n\
\ \"f1_stderr\": 0.0020360011017500185,\n \"acc\": 0.4351997070321265,\n\
\ \"acc_stderr\": 0.010043960065261932\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.02307046979865772,\n \"em_stderr\": 0.0015374446489046481,\n\
\ \"f1\": 0.08979446308724821,\n \"f1_stderr\": 0.0020360011017500185\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10007581501137225,\n \
\ \"acc_stderr\": 0.008266274528685632\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838232\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Doctor-Shotgun/CalliopeDS-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T04_36_21.549191
path:
- '**/details_harness|drop|3_2023-10-26T04-36-21.549191.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T04-36-21.549191.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T04_36_21.549191
path:
- '**/details_harness|gsm8k|5_2023-10-26T04-36-21.549191.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T04-36-21.549191.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-00-51.912601.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-00-51.912601.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T14-00-51.912601.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T04_36_21.549191
path:
- '**/details_harness|winogrande|5_2023-10-26T04-36-21.549191.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T04-36-21.549191.parquet'
- config_name: results
data_files:
- split: 2023_09_18T14_00_51.912601
path:
- results_2023-09-18T14-00-51.912601.parquet
- split: 2023_10_26T04_36_21.549191
path:
- results_2023-10-26T04-36-21.549191.parquet
- split: latest
path:
- results_2023-10-26T04-36-21.549191.parquet
---
# Dataset Card for Evaluation run of Doctor-Shotgun/CalliopeDS-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Doctor-Shotgun/CalliopeDS-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Doctor-Shotgun/CalliopeDS-L2-13B](https://huggingface.co/Doctor-Shotgun/CalliopeDS-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-L2-13B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-26T04:36:21.549191](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__CalliopeDS-L2-13B/blob/main/results_2023-10-26T04-36-21.549191.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.02307046979865772,
"em_stderr": 0.0015374446489046481,
"f1": 0.08979446308724821,
"f1_stderr": 0.0020360011017500185,
"acc": 0.4351997070321265,
"acc_stderr": 0.010043960065261932
},
"harness|drop|3": {
"em": 0.02307046979865772,
"em_stderr": 0.0015374446489046481,
"f1": 0.08979446308724821,
"f1_stderr": 0.0020360011017500185
},
"harness|gsm8k|5": {
"acc": 0.10007581501137225,
"acc_stderr": 0.008266274528685632
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838232
}
}
```
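As a sketch, the aggregated metrics can be flattened into `(task, metric, value)` rows with plain Python; the dict below inlines a subset of the "latest results" block shown above (in practice it would come from the `results_*.json` file or the "results" config of this dataset):

```python
# Subset of the aggregated metrics shown above, inlined for illustration.
latest_results = {
    "harness|drop|3": {"em": 0.02307046979865772, "f1": 0.08979446308724821},
    "harness|gsm8k|5": {"acc": 0.10007581501137225},
    "harness|winogrande|5": {"acc": 0.7703235990528808},
}

# Flatten the nested task -> metric -> value mapping into simple rows.
rows = [
    (task, metric, value)
    for task, metrics in latest_results.items()
    for metric, value in metrics.items()
]

for task, metric, value in rows:
    print(f"{task:<25} {metric}: {value:.4f}")
```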
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AdityaNG/BengaluruEmbeddings | ---
license: mit
---
|
Inventureoo7/Chatbotdata | ---
license: unknown
---
|
CyberHarem/shenhe_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shenhe/申鶴/申鹤 (Genshin Impact)
This is the dataset of shenhe/申鶴/申鹤 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `long_hair, breasts, blue_eyes, hair_over_one_eye, large_breasts, very_long_hair, hair_ornament, braid, grey_hair, tassel, white_hair, braided_ponytail, earrings, tassel_earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:---------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.29 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shenhe_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 1.03 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shenhe_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1394 | 2.03 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shenhe_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shenhe_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, black_bodysuit, black_gloves, breast_curtain, covered_navel, hip_vent, looking_at_viewer, shoulder_cutout, simple_background, solo, white_background, cowboy_shot, jewelry, partially_fingerless_gloves, blush, parted_lips |
| 1 | 8 |  |  |  |  |  | 1girl, black_bodysuit, breast_curtain, hip_vent, jewelry, looking_at_viewer, shoulder_cutout, solo, black_gloves, covered_navel, partially_fingerless_gloves, parted_lips |
| 2 | 56 |  |  |  |  |  | 1girl, breast_curtain, hip_vent, solo, black_bodysuit, shoulder_cutout, black_gloves, partially_fingerless_gloves, looking_at_viewer, holding_polearm, covered_navel, jewelry, closed_mouth, parted_lips |
| 3 | 29 |  |  |  |  |  | 1girl, bare_shoulders, solo, looking_at_viewer, sleeveless_dress, black_dress, thighs, parted_lips, detached_sleeves, jewelry, china_dress, official_alternate_costume, blush, cleavage |
| 4 | 5 |  |  |  |  |  | 1girl, alternate_costume, bare_shoulders, jewelry, smile, solo, wedding_dress, white_dress, white_gloves, bridal_veil, bride, elbow_gloves, looking_at_viewer, blue_hair, cleavage, full_body, hair_flower, holding_bouquet, simple_background, white_background, white_flower, closed_mouth, long_dress, petals, rose, skirt_hold, sleeveless, standing |
| 5 | 5 |  |  |  |  |  | 1girl, blush, cleavage, looking_at_viewer, solo, bare_shoulders, lingerie, thighs, arm_up, armpits, black_bra, black_gloves, collarbone, navel, parted_lips, black_panties, black_thighhighs, bridal_gauntlets, elbow_gloves, holding, jewelry, on_back, sitting, stomach, underwear_only |
| 6 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, solo, alternate_costume, white_shirt, black_skirt, office_lady, pencil_skirt, collared_shirt, blush, cleavage, contemporary, long_sleeves, thighs, jewelry, black_pantyhose, bra, holding, indoors, sitting, smile |
| 7 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, outdoors, thighs, blue_sky, cleavage, day, jewelry, navel, stomach, water, alternate_costume, wet, cloud, collarbone, cowboy_shot, beach, black_bikini, blush, closed_mouth, halterneck, side-tie_bikini_bottom, thigh_strap, white_bikini, bare_arms, choker, single_braid, string_bikini |
| 8 | 13 |  |  |  |  |  | 1girl, solo, alternate_costume, jewelry, looking_at_viewer, bare_shoulders, long_sleeves, open_jacket, midriff, sleeveless_shirt, white_shirt, black_jacket, crop_top, navel, off_shoulder, white_background, black_pants, closed_mouth, parted_lips, simple_background, sitting |
| 9 | 5 |  |  |  |  |  | 1girl, solo, black_dress, closed_mouth, looking_at_viewer, blush, grey_eyes, maid_apron, white_apron, cleavage, clothing_cutout, enmaided, frilled_apron, long_sleeves, maid_headdress, puffy_sleeves, simple_background, sitting, smile, thighhighs, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_bodysuit | black_gloves | breast_curtain | covered_navel | hip_vent | looking_at_viewer | shoulder_cutout | simple_background | solo | white_background | cowboy_shot | jewelry | partially_fingerless_gloves | blush | parted_lips | holding_polearm | closed_mouth | bare_shoulders | sleeveless_dress | black_dress | thighs | detached_sleeves | china_dress | official_alternate_costume | cleavage | alternate_costume | smile | wedding_dress | white_dress | white_gloves | bridal_veil | bride | elbow_gloves | blue_hair | full_body | hair_flower | holding_bouquet | white_flower | long_dress | petals | rose | skirt_hold | sleeveless | standing | lingerie | arm_up | armpits | black_bra | collarbone | navel | black_panties | black_thighhighs | bridal_gauntlets | holding | on_back | sitting | stomach | underwear_only | white_shirt | black_skirt | office_lady | pencil_skirt | collared_shirt | contemporary | long_sleeves | black_pantyhose | bra | indoors | outdoors | blue_sky | day | water | wet | cloud | beach | black_bikini | halterneck | side-tie_bikini_bottom | thigh_strap | white_bikini | bare_arms | choker | single_braid | string_bikini | open_jacket | midriff | sleeveless_shirt | black_jacket | crop_top | off_shoulder | black_pants | grey_eyes | maid_apron | white_apron | clothing_cutout | enmaided | frilled_apron | maid_headdress | puffy_sleeves | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:-----------------|:----------------|:-----------|:--------------------|:------------------|:--------------------|:-------|:-------------------|:--------------|:----------|:------------------------------|:--------|:--------------|:------------------|:---------------|:-----------------|:-------------------|:--------------|:---------|:-------------------|:--------------|:-----------------------------|:-----------|:--------------------|:--------|:----------------|:--------------|:---------------|:--------------|:--------|:---------------|:------------|:------------|:--------------|:------------------|:---------------|:-------------|:---------|:-------|:-------------|:-------------|:-----------|:-----------|:---------|:----------|:------------|:-------------|:--------|:----------------|:-------------------|:-------------------|:----------|:----------|:----------|:----------|:-----------------|:--------------|:--------------|:--------------|:---------------|:-----------------|:---------------|:---------------|:------------------|:------|:----------|:-----------|:-----------|:------|:--------|:------|:--------|:--------|:---------------|:-------------|:-------------------------|:--------------|:---------------|:------------|:---------|:---------------|:----------------|:--------------|:----------|:-------------------|:---------------|:-----------|:---------------|:--------------|:------------|:-------------|:--------------|:------------------|:-----------|:----------------|:-----------------|:----------------|:-------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 56 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 29 |  |  |  |  |  | X | | | | | | X | | | X | | | X | | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | | X | | X | X | X | | X | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | | | | X | | | X | | | X | | X | X | | | X | | | X | | | | X | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 16 |  |  |  |  |  | X | | | | | | X | | | X | | | X | | X | | | | | | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 11 |  |  |  |  |  | X | | | | | | X | | | X | | X | X | | X | | | X | X | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 8 | 13 |  |  |  |  |  | X | | | | | | X | | X | X | X | | X | | | X | | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | | | | X | | X | X | X | | | | X | | | X | | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
albertvillanova/bad-request | ---
dataset_info:
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 48
num_examples: 3
download_size: 950
dataset_size: 48
---
# Dataset Card for "test-16722377061524"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
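The split metadata in the YAML above is internally consistent: three rows, each holding two int64 fields (`x` and `y`) at 8 bytes apiece, occupy exactly 3 × 2 × 8 = 48 bytes. A minimal sketch of that check, with all numbers copied from the card:

```python
# Sanity-check the card's split metadata:
# 3 examples, each with two int64 columns (x and y), 8 bytes per int64.
num_examples = 3
int64_columns = 2
bytes_per_int64 = 8

computed_bytes = num_examples * int64_columns * bytes_per_int64
assert computed_bytes == 48  # matches both num_bytes and dataset_size above
```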
liuyanchen1015/MULTI_VALUE_cola_preposition_chopping | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 880
num_examples: 12
- name: test
num_bytes: 758
num_examples: 10
- name: train
num_bytes: 6267
num_examples: 77
download_size: 10049
dataset_size: 7905
---
# Dataset Card for "MULTI_VALUE_cola_preposition_chopping"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
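The split sizes declared above can be cross-checked locally: the per-split byte counts sum to the declared `dataset_size`, and the example counts sum to 99 rows. A minimal sketch, with all values copied from the card's YAML:

```python
# Per-split byte counts and the declared total, copied from the card.
split_bytes = {"dev": 880, "test": 758, "train": 6267}
dataset_size = 7905
assert sum(split_bytes.values()) == dataset_size

# Example counts per split, also from the card: 99 rows in total.
split_examples = {"dev": 12, "test": 10, "train": 77}
assert sum(split_examples.values()) == 99
```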
mask-distilled-one-sec-cv12/chunk_116 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1390421520
num_examples: 273060
download_size: 1417200673
dataset_size: 1390421520
---
# Dataset Card for "chunk_116"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
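The train split above is unusually regular: 1,390,421,520 bytes over 273,060 examples works out to exactly 5,092 bytes per example, which suggests (though the card does not state it) fixed-length logits and MFCC rows. A quick check, with values copied from the card:

```python
# Check that the train split's byte count divides evenly across its examples.
num_bytes = 1_390_421_520
num_examples = 273_060

bytes_per_example, remainder = divmod(num_bytes, num_examples)
assert remainder == 0
assert bytes_per_example == 5_092  # consistent with fixed-size rows
```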
Myashka/SO-Python_basics_QA-filtered-2023-T5_paraphrased-tanh_score | ---
license: mit
---
|
dbruner23/davids-mini-platypus | ---
license: cc
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4139593
num_examples: 1000
download_size: 2237721
dataset_size: 4139593
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tiagoblima/tedtalk2012-punctuation-binary | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 383989
num_examples: 2442
download_size: 151918
dataset_size: 383989
---
# Dataset Card for "nilc-punctuation-binary"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Idrizorg/WER_Evaluation_For_TTS | ---
task_categories:
- text-to-speech
language:
- en
pretty_name: SOMOS
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b | ---
pretty_name: Evaluation run of migtissera/Synthia-70B-v1.2b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-70B-v1.2b](https://huggingface.co/migtissera/Synthia-70B-v1.2b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T18:54:59.551883](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b/blob/main/results_2023-10-24T18-54-59.551883.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.44190436241610737,\n\
\ \"em_stderr\": 0.00508578632439048,\n \"f1\": 0.5040551593959751,\n\
\ \"f1_stderr\": 0.00484284160320387,\n \"acc\": 0.5957647712115981,\n\
\ \"acc_stderr\": 0.011744811294358018\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.44190436241610737,\n \"em_stderr\": 0.00508578632439048,\n\
\ \"f1\": 0.5040551593959751,\n \"f1_stderr\": 0.00484284160320387\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3525398028809704,\n \
\ \"acc_stderr\": 0.013159909755930321\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785717\n\
\ }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-70B-v1.2b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|arc:challenge|25_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T18_54_59.551883
path:
- '**/details_harness|drop|3_2023-10-24T18-54-59.551883.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T18-54-59.551883.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T18_54_59.551883
path:
- '**/details_harness|gsm8k|5_2023-10-24T18-54-59.551883.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T18-54-59.551883.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hellaswag|10_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T14-25-34.731307.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T14-25-34.731307.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T14-25-34.731307.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T18_54_59.551883
path:
- '**/details_harness|winogrande|5_2023-10-24T18-54-59.551883.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T18-54-59.551883.parquet'
- config_name: results
data_files:
- split: 2023_09_13T14_25_34.731307
path:
- results_2023-09-13T14-25-34.731307.parquet
- split: 2023_10_24T18_54_59.551883
path:
- results_2023-10-24T18-54-59.551883.parquet
- split: latest
path:
- results_2023-10-24T18-54-59.551883.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-70B-v1.2b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-70B-v1.2b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-70B-v1.2b](https://huggingface.co/migtissera/Synthia-70B-v1.2b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T18:54:59.551883](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b/blob/main/results_2023-10-24T18-54-59.551883.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.44190436241610737,
"em_stderr": 0.00508578632439048,
"f1": 0.5040551593959751,
"f1_stderr": 0.00484284160320387,
"acc": 0.5957647712115981,
"acc_stderr": 0.011744811294358018
},
"harness|drop|3": {
"em": 0.44190436241610737,
"em_stderr": 0.00508578632439048,
"f1": 0.5040551593959751,
"f1_stderr": 0.00484284160320387
},
"harness|gsm8k|5": {
"acc": 0.3525398028809704,
"acc_stderr": 0.013159909755930321
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.010329712832785717
}
}
```
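As a quick sanity check, the aggregated block above can be read back with plain Python. This is a minimal sketch: the dict literal mirrors the JSON snippet above, so no network access or the `datasets` library is needed here.

```python
# Headline metrics copied from the aggregated results JSON shown above.
results = {
    "all": {
        "em": 0.44190436241610737,
        "f1": 0.5040551593959751,
        "acc": 0.5957647712115981,
    },
    "harness|gsm8k|5": {"acc": 0.3525398028809704},
    "harness|winogrande|5": {"acc": 0.8389897395422258},
}

# Collect every per-task accuracy, keyed by the task name between the pipes.
task_acc = {
    key.split("|")[1]: metrics["acc"]
    for key, metrics in results.items()
    if key != "all" and "acc" in metrics
}
print(task_acc)  # per-task accuracies for gsm8k and winogrande
```

The same `harness|<task>|<n_shots>` key convention is used by every per-task entry in the results files.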
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
polinaeterna/amazon_apparel_copy | ---
dataset_info:
features:
- name: marketplace
dtype: string
- name: customer_id
dtype: string
- name: review_id
dtype: string
- name: product_id
dtype: string
- name: product_parent
dtype: string
- name: product_title
dtype: string
- name: product_category
dtype: string
- name: star_rating
dtype: int32
- name: helpful_votes
dtype: int32
- name: total_votes
dtype: int32
- name: vine
dtype:
class_label:
names:
'0': 'N'
'1': 'Y'
- name: verified_purchase
dtype:
class_label:
names:
'0': 'N'
'1': 'Y'
- name: review_headline
dtype: string
- name: review_body
dtype: string
- name: review_date
dtype: string
splits:
- name: train
num_bytes: 2254343574
num_examples: 5906333
download_size: 1027207588
dataset_size: 2254343574
duplicated_from: polinaeterna/amazon_apparel
---
# Dataset Card for "amazon_apparel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lerobot/aloha | ---
dataset_info:
config_name: mobile_cabinet
features:
- name: qpos
sequence:
sequence: float32
- name: qvel
sequence:
sequence: float32
- name: action
sequence:
sequence: float32
splits:
- name: train
num_bytes: 540024
num_examples: 2
download_size: 181984
dataset_size: 540024
configs:
- config_name: mobile_cabinet
data_files:
- split: train
path: mobile_cabinet/train-*
---
|
naorm/website-screenshots-blip-large | ---
dataset_info:
features:
- name: image
dtype: image
- name: index
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 151928472.776
num_examples: 1688
- name: validation
num_bytes: 44126471.0
num_examples: 484
- name: test
num_bytes: 22288179.0
num_examples: 242
download_size: 56770334
dataset_size: 218343122.776
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
BitTranslate/chatgpt-prompts-Georgian | ---
license: cc0-1.0
language:
- ka
tags:
- ChatGPT
--- |
Capsekai/Uracon | ---
license: creativeml-openrail-m
task_categories:
- text-classification
language:
- en
tags:
- art
size_categories:
- 1K<n<10K
---
The animation was independently produced by Shinji Aramaki and his manga club during their time at Okayama University. The animation premiered at the URACON III sci-fi convention in 1984.
More information can be found on MyAnimeList https://myanimelist.net/anime/42390/Uracon_III_Opening_Animation
More caps can be found on our Tumblr: https://capsekai.tumblr.com/ |
skrishna/SeqSense_2 | ---
dataset_info:
features:
- name: input
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 16891
num_examples: 300
download_size: 4717
dataset_size: 16891
---
# Dataset Card for "SeqSense_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cleitindograu432/dataset | ---
license: openrail
---
|
autoevaluate/autoeval-staging-eval-project-5480d71b-7995081 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cifar10
eval_info:
task: image_multi_class_classification
model: aaraki/vit-base-patch16-224-in21k-finetuned-cifar10
metrics: []
dataset_name: cifar10
dataset_config: plain_text
dataset_split: test
col_mapping:
image: img
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Image Classification
* Model: aaraki/vit-base-patch16-224-in21k-finetuned-cifar10
* Dataset: cifar10
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
tolgadev/autotrain-data-rottentomato | ---
task_categories:
- text-classification
---
# AutoTrain Dataset for project: rottentomato
## Dataset Description
This dataset has been automatically processed by AutoTrain for project rottentomato.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "too much of storytelling moves away from solondz's social critique , casting its audience as that of intellectual lector in contemplation of the auteur's professional injuries .",
"target": 1
},
{
"text": "what the audience feels is exhaustion , from watching a movie that is dark ( dark green , to be exact ) , sour , bloody and mean .",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['neg', 'pos'], id=None)"
}
```
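Given the `ClassLabel` above, the integer `target` values in the samples map back to label names with a simple lookup. This is a plain-Python sketch; with the 🤗 Datasets library the equivalent would be `dataset.features["target"].int2str(...)`.

```python
# Label names come from the ClassLabel definition above: 0 -> neg, 1 -> pos.
label_names = ["neg", "pos"]

# Abbreviated copies of the two samples shown above.
samples = [
    {"text": "too much of storytelling moves away from solondz's social critique ...", "target": 1},
    {"text": "what the audience feels is exhaustion ...", "target": 0},
]

# Attach the human-readable label to each sample.
for sample in samples:
    sample["label"] = label_names[sample["target"]]

print([s["label"] for s in samples])  # ['pos', 'neg']
```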
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 852 |
| valid | 214 |
|
misshimichka/flower_dataset | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 42915139.0
num_examples: 50
download_size: 42916734
dataset_size: 42915139.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MonoHime/ru_sentiment_dataset | ---
language:
- ru
tags:
- sentiment
- text-classification
---
# Dataset with sentiment of Russian text
Contains an aggregated dataset of Russian texts drawn from 6 source datasets.
## Labels meaning
- 0: NEUTRAL
- 1: POSITIVE
- 2: NEGATIVE
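A minimal lookup table for these label ids (a sketch; the mapping is exactly the one listed above):

```python
# Sentiment label ids as documented above.
ID2LABEL = {0: "NEUTRAL", 1: "POSITIVE", 2: "NEGATIVE"}
# Inverse mapping, useful when encoding labels for training.
LABEL2ID = {name: i for i, name in ID2LABEL.items()}

print(ID2LABEL[1])           # POSITIVE
print(LABEL2ID["NEGATIVE"])  # 2
```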
## Datasets
**[Sentiment Analysis in Russian](https://www.kaggle.com/c/sentiment-analysis-in-russian/data)**
> Sentiments (positive, negative or neutral) of news in the Russian language, from a Kaggle competition.
**[Russian Language Toxic Comments](https://www.kaggle.com/blackmoon/russian-language-toxic-comments/)**
> Small dataset with labeled comments from 2ch.hk and pikabu.ru.
**[Dataset of car reviews for machine learning (sentiment analysis)](https://github.com/oldaandozerskaya/auto_reviews)**
> Glazkova A. The evaluation of the proximity of text categories for solving electronic documents classification tasks //VESTNIK TOMSKOGO GOSUDARSTVENNOGO UNIVERSITETA-UPRAVLENIE VYCHISLITELNAJA TEHNIKA I INFORMATIKA-TOMSK STATE UNIVERSITY JOURNAL OF CONTROL AND COMPUTER SCIENCE. – 2015. – Т. 31. – №. 2. – С. 18-25.
**[Sentiment datasets by Blinov](https://github.com/natasha/corus/issues/14)**
> Datasets containing reviews from different domains.
**[LINIS Crowd](http://www.linis-crowd.org/)**
> The work "LINIS Crowd SENT - a sentiment lexicon and a collection of texts with sentiment annotation", created by Sergei Koltcov, Olessia Koltsova and Svetlana Alexeeva.
**[Russian Hotel Reviews Dataset](https://drive.google.com/drive/folders/17sa3h4XHcG0MJGrbfOsbL-kDW29CuJul)**
> Hotel reviews in Russian |
open-llm-leaderboard/details_Locutusque__Hyperion-3.0-Yi-34B | ---
pretty_name: Evaluation run of Locutusque/Hyperion-3.0-Yi-34B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/Hyperion-3.0-Yi-34B](https://huggingface.co/Locutusque/Hyperion-3.0-Yi-34B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__Hyperion-3.0-Yi-34B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T17:00:46.629310](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hyperion-3.0-Yi-34B/blob/main/results_2024-03-21T17-00-46.629310.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7544550658014192,\n\
\ \"acc_stderr\": 0.02829751963964748,\n \"acc_norm\": 0.7594893210634434,\n\
\ \"acc_norm_stderr\": 0.02882703593176765,\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5637641591908886,\n\
\ \"mc2_stderr\": 0.014721062007579585\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6151877133105802,\n \"acc_stderr\": 0.014218371065251102,\n\
\ \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.013975454122756558\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6533559051981677,\n\
\ \"acc_stderr\": 0.004749286071559571,\n \"acc_norm\": 0.8561043616809401,\n\
\ \"acc_norm_stderr\": 0.0035026656741971468\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n\
\ \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n\
\ \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.024974533450920697,\n\
\ \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.024974533450920697\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n\
\ \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8819444444444444,\n\
\ \"acc_stderr\": 0.026983346503309358,\n \"acc_norm\": 0.8819444444444444,\n\
\ \"acc_norm_stderr\": 0.026983346503309358\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n\
\ \"acc_stderr\": 0.033687629322594316,\n \"acc_norm\": 0.7341040462427746,\n\
\ \"acc_norm_stderr\": 0.033687629322594316\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n\
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7574468085106383,\n \"acc_stderr\": 0.028020226271200217,\n\
\ \"acc_norm\": 0.7574468085106383,\n \"acc_norm_stderr\": 0.028020226271200217\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.8206896551724138,\n \"acc_stderr\": 0.03196766433373187,\n\
\ \"acc_norm\": 0.8206896551724138,\n \"acc_norm_stderr\": 0.03196766433373187\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.656084656084656,\n \"acc_stderr\": 0.024464426625596437,\n \"\
acc_norm\": 0.656084656084656,\n \"acc_norm_stderr\": 0.024464426625596437\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n\
\ \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n\
\ \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n\
\ \"acc_stderr\": 0.018225757949432306,\n \"acc_norm\": 0.8838709677419355,\n\
\ \"acc_norm_stderr\": 0.018225757949432306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.625615763546798,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.625615763546798,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\"\
: 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821677,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821677\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527043,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527043\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.020567539567246787,\n\
\ \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.020567539567246787\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4185185185185185,\n \"acc_stderr\": 0.030078013075022055,\n \
\ \"acc_norm\": 0.4185185185185185,\n \"acc_norm_stderr\": 0.030078013075022055\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707952,\n\
\ \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707952\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"\
acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9192660550458716,\n \"acc_stderr\": 0.011680172292862086,\n \"\
acc_norm\": 0.9192660550458716,\n \"acc_norm_stderr\": 0.011680172292862086\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6574074074074074,\n \"acc_stderr\": 0.032365852526021574,\n \"\
acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.032365852526021574\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9156118143459916,\n \"acc_stderr\": 0.018094247116473314,\n \
\ \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.018094247116473314\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515375,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515375\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9173553719008265,\n \"acc_stderr\": 0.025135382356604227,\n \"\
acc_norm\": 0.9173553719008265,\n \"acc_norm_stderr\": 0.025135382356604227\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8834355828220859,\n \"acc_stderr\": 0.025212327210507108,\n\
\ \"acc_norm\": 0.8834355828220859,\n \"acc_norm_stderr\": 0.025212327210507108\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881348,\n\
\ \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881348\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n\
\ \"acc_stderr\": 0.017004368568132342,\n \"acc_norm\": 0.9273504273504274,\n\
\ \"acc_norm_stderr\": 0.017004368568132342\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8978288633461047,\n\
\ \"acc_stderr\": 0.01083072471313418,\n \"acc_norm\": 0.8978288633461047,\n\
\ \"acc_norm_stderr\": 0.01083072471313418\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8294797687861272,\n \"acc_stderr\": 0.020247961569303728,\n\
\ \"acc_norm\": 0.8294797687861272,\n \"acc_norm_stderr\": 0.020247961569303728\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5843575418994413,\n\
\ \"acc_stderr\": 0.016482782187500673,\n \"acc_norm\": 0.5843575418994413,\n\
\ \"acc_norm_stderr\": 0.016482782187500673\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.01970403918385981,\n\
\ \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.01970403918385981\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8295819935691319,\n\
\ \"acc_stderr\": 0.02135534302826405,\n \"acc_norm\": 0.8295819935691319,\n\
\ \"acc_norm_stderr\": 0.02135534302826405\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8734567901234568,\n \"acc_stderr\": 0.018498600558790913,\n\
\ \"acc_norm\": 0.8734567901234568,\n \"acc_norm_stderr\": 0.018498600558790913\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.648936170212766,\n \"acc_stderr\": 0.02847350127296376,\n \
\ \"acc_norm\": 0.648936170212766,\n \"acc_norm_stderr\": 0.02847350127296376\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6010430247718384,\n\
\ \"acc_stderr\": 0.01250675765529368,\n \"acc_norm\": 0.6010430247718384,\n\
\ \"acc_norm_stderr\": 0.01250675765529368\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8051470588235294,\n \"acc_stderr\": 0.024060599423487424,\n\
\ \"acc_norm\": 0.8051470588235294,\n \"acc_norm_stderr\": 0.024060599423487424\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \
\ \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.043502714429232425,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.043502714429232425\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8530612244897959,\n \"acc_stderr\": 0.022665400417217638,\n\
\ \"acc_norm\": 0.8530612244897959,\n \"acc_norm_stderr\": 0.022665400417217638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101696,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101696\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5637641591908886,\n\
\ \"mc2_stderr\": 0.014721062007579585\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237424\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6103108415466262,\n \
\ \"acc_stderr\": 0.013433123236110706\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/Hyperion-3.0-Yi-34B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|arc:challenge|25_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|gsm8k|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hellaswag|10_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T17-00-46.629310.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T17-00-46.629310.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- '**/details_harness|winogrande|5_2024-03-21T17-00-46.629310.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T17-00-46.629310.parquet'
- config_name: results
data_files:
- split: 2024_03_21T17_00_46.629310
path:
- results_2024-03-21T17-00-46.629310.parquet
- split: latest
path:
- results_2024-03-21T17-00-46.629310.parquet
---
# Dataset Card for Evaluation run of Locutusque/Hyperion-3.0-Yi-34B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/Hyperion-3.0-Yi-34B](https://huggingface.co/Locutusque/Hyperion-3.0-Yi-34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__Hyperion-3.0-Yi-34B",
"harness_winogrande_5",
	split="latest")
```
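The timestamped split names (e.g. `2024_03_21T17_00_46.629310` above) encode the run's start time, with underscores in place of the usual `-` and `:` separators. A minimal sketch of parsing one back into a `datetime`, e.g. for sorting successive runs (the helper name is illustrative, not part of the `datasets` API):

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names follow the pattern YYYY_MM_DDTHH_MM_SS.ffffff,
    # i.e. an ISO-like timestamp with '_' replacing '-' and ':'.
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

run_time = split_to_datetime("2024_03_21T17_00_46.629310")
print(run_time.isoformat())  # prints "2024-03-21T17:00:46.629310"
```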
## Latest results
These are the [latest results from run 2024-03-21T17:00:46.629310](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hyperion-3.0-Yi-34B/blob/main/results_2024-03-21T17-00-46.629310.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one under its per-task config, in the timestamped and "latest" splits):
```json
{
"all": {
"acc": 0.7544550658014192,
"acc_stderr": 0.02829751963964748,
"acc_norm": 0.7594893210634434,
"acc_norm_stderr": 0.02882703593176765,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.5637641591908886,
"mc2_stderr": 0.014721062007579585
},
"harness|arc:challenge|25": {
"acc": 0.6151877133105802,
"acc_stderr": 0.014218371065251102,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.013975454122756558
},
"harness|hellaswag|10": {
"acc": 0.6533559051981677,
"acc_stderr": 0.004749286071559571,
"acc_norm": 0.8561043616809401,
"acc_norm_stderr": 0.0035026656741971468
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.024974533450920697,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.024974533450920697
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8819444444444444,
"acc_stderr": 0.026983346503309358,
"acc_norm": 0.8819444444444444,
"acc_norm_stderr": 0.026983346503309358
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.033687629322594316,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.033687629322594316
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7574468085106383,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.7574468085106383,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8206896551724138,
"acc_stderr": 0.03196766433373187,
"acc_norm": 0.8206896551724138,
"acc_norm_stderr": 0.03196766433373187
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.656084656084656,
"acc_stderr": 0.024464426625596437,
"acc_norm": 0.656084656084656,
"acc_norm_stderr": 0.024464426625596437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432306,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.625615763546798,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.625615763546798,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821677,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821677
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527043,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527043
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.020567539567246787,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.020567539567246787
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4185185185185185,
"acc_stderr": 0.030078013075022055,
"acc_norm": 0.4185185185185185,
"acc_norm_stderr": 0.030078013075022055
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707952,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707952
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9192660550458716,
"acc_stderr": 0.011680172292862086,
"acc_norm": 0.9192660550458716,
"acc_norm_stderr": 0.011680172292862086
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.032365852526021574,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.032365852526021574
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.018094247116473314,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.018094247116473314
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.030884661089515375,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.030884661089515375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9173553719008265,
"acc_stderr": 0.025135382356604227,
"acc_norm": 0.9173553719008265,
"acc_norm_stderr": 0.025135382356604227
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8834355828220859,
"acc_stderr": 0.025212327210507108,
"acc_norm": 0.8834355828220859,
"acc_norm_stderr": 0.025212327210507108
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881348,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.017004368568132342,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.017004368568132342
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8978288633461047,
"acc_stderr": 0.01083072471313418,
"acc_norm": 0.8978288633461047,
"acc_norm_stderr": 0.01083072471313418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8294797687861272,
"acc_stderr": 0.020247961569303728,
"acc_norm": 0.8294797687861272,
"acc_norm_stderr": 0.020247961569303728
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5843575418994413,
"acc_stderr": 0.016482782187500673,
"acc_norm": 0.5843575418994413,
"acc_norm_stderr": 0.016482782187500673
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.01970403918385981,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.01970403918385981
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8295819935691319,
"acc_stderr": 0.02135534302826405,
"acc_norm": 0.8295819935691319,
"acc_norm_stderr": 0.02135534302826405
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8734567901234568,
"acc_stderr": 0.018498600558790913,
"acc_norm": 0.8734567901234568,
"acc_norm_stderr": 0.018498600558790913
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.648936170212766,
"acc_stderr": 0.02847350127296376,
"acc_norm": 0.648936170212766,
"acc_norm_stderr": 0.02847350127296376
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6010430247718384,
"acc_stderr": 0.01250675765529368,
"acc_norm": 0.6010430247718384,
"acc_norm_stderr": 0.01250675765529368
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8051470588235294,
"acc_stderr": 0.024060599423487424,
"acc_norm": 0.8051470588235294,
"acc_norm_stderr": 0.024060599423487424
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.043502714429232425,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.043502714429232425
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8530612244897959,
"acc_stderr": 0.022665400417217638,
"acc_norm": 0.8530612244897959,
"acc_norm_stderr": 0.022665400417217638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101696,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101696
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133196,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.5637641591908886,
"mc2_stderr": 0.014721062007579585
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237424
},
"harness|gsm8k|5": {
"acc": 0.6103108415466262,
"acc_stderr": 0.013433123236110706
}
}
```
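Since the results above are plain JSON, headline metrics can be pulled out with the standard library once the file is downloaded. A minimal sketch over a trimmed inline copy of the data (the `harness|<task>|<n_fewshot>` key format follows the naming used above):

```python
import json

# A trimmed copy of the results JSON above (illustrative subset only).
results_json = """
{
  "all": {"acc": 0.7544550658014192, "acc_stderr": 0.02829751963964748},
  "harness|arc:challenge|25": {"acc_norm": 0.6459044368600683},
  "harness|hellaswag|10": {"acc_norm": 0.8561043616809401},
  "harness|gsm8k|5": {"acc": 0.6103108415466262}
}
"""

results = json.loads(results_json)

# Per-task entries are keyed as "harness|<task>|<n_fewshot>".
for key, metrics in results.items():
    if key == "all":
        continue
    _, task, n_shot = key.split("|")
    print(f"{task} ({n_shot}-shot): {metrics}")

# Approximate 95% confidence interval for the aggregate accuracy,
# using the reported standard error.
acc = results["all"]["acc"]
stderr = results["all"]["acc_stderr"]
ci = (acc - 1.96 * stderr, acc + 1.96 * stderr)
```

The same approach works on the full file; only the inline JSON here is abbreviated.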
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ostapeno/code_alpaca | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 7830075
num_examples: 20022
download_size: 3538209
dataset_size: 7830075
---
# Dataset Card for "code_alpaca"
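The YAML above describes each row as a `dataset`/`id` pair plus a `messages` list of role/content turns. A hypothetical record matching that schema (field names are taken from the `dataset_info` block; the values are invented for illustration, not actual rows):

```python
# Field names follow the dataset_info schema above; the values are
# hypothetical examples, not taken from the dataset itself.
record = {
    "dataset": "code_alpaca",
    "id": "code_alpaca_0",
    "messages": [
        {"role": "user", "content": "Write a Python function that reverses a string."},
        {"role": "assistant", "content": "def reverse(s):\n    return s[::-1]"},
    ],
}

# A minimal schema check mirroring the declared features.
assert set(record) == {"dataset", "id", "messages"}
assert all(set(m) == {"role", "content"} for m in record["messages"])
```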
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_252 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 16779297456.875
num_examples: 174697
download_size: 14937788290
dataset_size: 16779297456.875
---
# Dataset Card for "chunk_252"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pepoo20/Math_Elementary-HighSchool | ---
dataset_info:
- config_name: Deepening_given_equation
features:
- name: 'Unnamed: 0'
dtype: int64
- name: Text Responses
dtype: string
- name: Symbolic Form
dtype: string
- name: Symbolic Answers
dtype: string
splits:
- name: train
num_bytes: 1651
num_examples: 9
download_size: 4234
dataset_size: 1651
- config_name: TestDataset
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: Source
dtype: string
- name: Type
dtype: string
- name: Word Problem
dtype: string
splits:
- name: train
num_bytes: 9003
num_examples: 60
download_size: 6765
dataset_size: 9003
- config_name: Variables_On_Both_Sides_GivenEquation
features:
- name: 'Unnamed: 0'
dtype: int64
- name: Text Responses
dtype: string
- name: Symbolic Form
dtype: string
- name: Symbolic Answers
dtype: string
splits:
- name: train
num_bytes: 2286
num_examples: 9
download_size: 5109
dataset_size: 2286
- config_name: WordProblem
features:
- name: 'Unnamed: 0'
dtype: int64
- name: Text Responses
dtype: string
- name: Symbolic Form
dtype: string
- name: Symbolic Answers
dtype: string
splits:
- name: train
num_bytes: 7724
num_examples: 9
download_size: 9654
dataset_size: 7724
- config_name: WordProblem_EquationGiven
features:
- name: 'Unnamed: 0'
dtype: int64
- name: Text Responses
dtype: string
- name: Symbolic Form
dtype: string
- name: Symbolic Answers
dtype: string
splits:
- name: train
num_bytes: 6842
num_examples: 9
download_size: 8973
dataset_size: 6842
- config_name: WordProblem_UnGivenEquation
features:
- name: 'Unnamed: 0'
dtype: int64
- name: Text Responses
dtype: string
- name: Symbolic Form
dtype: string
- name: Symbolic Answers
dtype: string
splits:
- name: train
num_bytes: 5416
num_examples: 9
download_size: 6320
dataset_size: 5416
configs:
- config_name: Deepening_given_equation
data_files:
- split: train
path: Deepening_given_equation/train-*
- config_name: TestDataset
data_files:
- split: train
path: TestDataset/train-*
- config_name: Variables_On_Both_Sides_GivenEquation
data_files:
- split: train
path: Variables_On_Both_Sides_GivenEquation/train-*
- config_name: WordProblem
data_files:
- split: train
path: WordProblem/train-*
- config_name: WordProblem_EquationGiven
data_files:
- split: train
path: WordProblem_EquationGiven/train-*
- config_name: WordProblem_UnGivenEquation
data_files:
- split: train
path: WordProblem_UnGivenEquation/train-*
---
|
nookbe/Handelsgesetzbuch_HGB | ---
license: mit
task_categories:
- text-classification
language:
- de
tags:
- legal
pretty_name: HGB
size_categories:
- 1K<n<10K
---
---
# German HGB Law Dataset (Handelsgesetzbuch)
## Dataset Description
- **Date of Last Paragraph Update:** April 2023
- **Dataset Guarantee:** The dataset is provided "as is," and there is no guarantee for the correctness or completeness of the data.
### Dataset Summary
The HGB Law Dataset contains legal text from the German Commercial Code (Handelsgesetzbuch - HGB). It focuses on the general principles of German commercial law, and the dataset is designed for tasks related to legal text analysis.
## Dataset Structure
### Data Instances
A typical data point in the dataset comprises a legal paragraph and its corresponding text. For example:
```json
{
  "paragraph": "§ 1 Handelsstand",
  "text": "Wer ein Handelsgewerbe betreibt, ist Kaufmann."
}
``` |
qwedsacf/subnet6-evaluation | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: loss_0x0dad0/nous_nous_v7_5--None
dtype: float64
- name: loss_zzttbrdd/sn6_21--None
dtype: float64
- name: loss_zzttbrdd/sn6_10--None
dtype: float64
- name: loss_0x0dad0/nous_nous_v8_4--None
dtype: float64
splits:
- name: train
num_bytes: 1331777
num_examples: 400
download_size: 726645
dataset_size: 1331777
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jlbaker361/multiplication_decimal | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 2349837.9
num_examples: 29376
- name: test
num_bytes: 261093.1
num_examples: 3264
download_size: 1140671
dataset_size: 2610931.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "multiplication_decimal"
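Going by the schema above (`input` string, `output` float64, `text` string), a row plausibly pairs a decimal multiplication prompt with its numeric result. A hypothetical record and a consistency check (the field names come from the `dataset_info` block; the string formatting is an assumption, not taken from the data):

```python
# Hypothetical record: field names follow the dataset_info schema above;
# the string format of "input" and "text" is assumed for illustration.
record = {
    "input": "3.2 * 4.5",
    "output": 14.4,
    "text": "3.2 * 4.5 = 14.4",
}

# Check the stated output against re-evaluating the expression.
left, right = (float(x) for x in record["input"].split("*"))
assert abs(left * right - record["output"]) < 1e-9
```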
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xl_mode_C_HM_A_T_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 1169674
num_examples: 1000
download_size: 206401
dataset_size: 1169674
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xl_mode_C_HM_A_T_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sav7669/sroie_data_set | ---
license: openrail
---
|
julien-c/tweets | ---
license: other
---
# some of julien-c's tweets
Use this to power my personal chat agent(s) to chat and act on my behalf.
|
xontoloyoo/mymodel | ---
license: creativeml-openrail-m
---
|