| datasetId | card |
|---|---|
| laion/laion2B-multi-joined | Invalid username or password. |
CyberHarem/collei_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of collei/コレイ/柯莱 (Genshin Impact)
This is the dataset of collei/コレイ/柯莱 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `green_hair, purple_eyes, hair_ornament, hair_between_eyes, earrings, medium_hair, ahoge, long_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
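Pruning here means that these character-defining core tags are removed from each image's tag list before packaging. A minimal sketch of the idea, using the core tags listed in this card (the sample tag list below is illustrative, not a real dataset entry):

```python
# Core tags that describe the character herself, as listed in this card.
CORE_TAGS = {
    'green_hair', 'purple_eyes', 'hair_ornament', 'hair_between_eyes',
    'earrings', 'medium_hair', 'ahoge', 'long_hair', 'breasts',
}

def prune_core_tags(tags):
    """Drop the character-defining tags, keeping the original tag order."""
    return [t for t in tags if t not in CORE_TAGS]

# Illustrative tag list, not a real dataset entry:
print(prune_core_tags(['1girl', 'green_hair', 'solo', 'purple_eyes']))
# ['1girl', 'solo']
```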
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 881.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/collei_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200             | 500    | 748.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/collei_genshin/resolve/main/dataset-1200.zip)             | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1217   | 1.44 GiB   | [Download](https://huggingface.co/datasets/CyberHarem/collei_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT    | 3-stage cropped dataset with each crop area no smaller than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/collei_genshin',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract its files into your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
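Once loaded, the per-image tags can be aggregated without any extra dependencies, e.g. to see which tags dominate the dataset. A minimal sketch; it assumes each `item.meta['tags']` is either a list of tag strings or a dict keyed by tag name (the exact layout depends on the waifuc version), and the sample records below are hand-written, not real dataset entries:

```python
from collections import Counter

def tag_frequencies(tag_records):
    """Count how often each tag name appears across images.

    Each record may be a list of tag strings or a dict keyed by tag
    name; only the tag names are counted.
    """
    counts = Counter()
    for record in tag_records:
        counts.update(record.keys() if isinstance(record, dict) else record)
    return counts

# Hand-written sample records (not real dataset entries):
records = [
    ['1girl', 'green_capelet', 'solo'],
    {'1girl': 0.99, 'smile': 0.87},
]
print(tag_frequencies(records).most_common(1))  # [('1girl', 2)]
```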
## List of Clusters
Tag clustering results are listed below; some of the character's outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, closed_mouth, green_capelet, long_sleeves, looking_at_viewer, solo, crossed_bangs, dress, smile, single_earring, upper_body, blush, bridal_gauntlets |
| 1 | 29 |  |  |  |  |  | 1girl, green_capelet, black_dress, solo, looking_at_viewer, brown_thighhighs, smile, blush, crossed_bangs, detached_sleeves, puffy_long_sleeves, single_earring, thighlet, bridal_gauntlets, simple_background, closed_mouth, white_background, open_mouth |
| 2 | 7 |  |  |  |  |  | 1girl, blush, green_capelet, solo, crossed_bangs, simple_background, upper_body, closed_mouth, looking_at_viewer, single_earring, white_background, holding |
| 3 | 20 |  |  |  |  |  | 1girl, looking_at_viewer, solo, green_capelet, black_dress, jewelry, holding_bow_(weapon), closed_mouth, detached_sleeves, puffy_long_sleeves, vision_(genshin_impact), brown_thighhighs, bridal_gauntlets, gloves |
| 4 | 10 |  |  |  |  |  | 1girl, black_panties, from_behind, looking_at_viewer, looking_back, solo, thighs, blush, green_capelet, ass_focus, cameltoe, detached_sleeves, long_sleeves, sweat, huge_ass, closed_mouth, crossed_bangs, single_earring, thong, black_thighhighs, partially_visible_vulva, sideboob, tassel, backless_outfit, brown_thighhighs, dress |
| 5 | 9 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, open_mouth, penis, pussy, sex, spread_legs, vaginal, blush, medium_breasts, navel, nude, single_earring, uncensored, large_breasts, brown_thighhighs, erection, looking_at_viewer |
| 6 | 6 |  |  |  |  |  | 1girl, long_sleeves, eyewear_on_head, fingerless_gloves, holding_pen, solo, white_jacket, backpack, closed_mouth, notepad, open_clothes, outdoors, paper, shirt, black_pants, goggles_on_head, jewelry |
| 7 | 6 |  |  |  |  |  | 1girl, blush, navel, nipples, nude, pussy, spread_legs, single_earring, sitting, solo, anus, looking_at_viewer, female_masturbation, medium_breasts, mosaic_censoring, open_mouth, small_breasts |
| 8 | 15 |  |  |  |  |  | 1girl, alternate_costume, bare_shoulders, blush, crossed_bangs, jewelry, looking_at_viewer, sleeveless_dress, solo, halter_dress, large_breasts, thighs, smile, cleavage, closed_mouth, bare_arms, sweat, covered_navel, outdoors, clothing_cutout, medium_breasts, alternate_breast_size, armpit_crease, sideboob, sitting, very_long_hair, building |
| 9 | 5 |  |  |  |  |  | 1girl, blush, erection, futanari, large_breasts, large_penis, mosaic_censoring, navel, nipples, solo, stomach, testicles, thighs, alternate_breast_size, closed_mouth, cowboy_shot, crossed_bangs, female_pubic_hair, looking_at_viewer, single_earring, veiny_penis, armpits, arms_behind_head, arms_up, black_sleeves, choker, collarbone, completely_nude, detached_sleeves, huge_breasts, long_sleeves, open_mouth, outdoors, patreon_username, water, wet |
| 10 | 5 |  |  |  |  |  | 1girl, beach, blue_sky, blush, cloud, day, looking_at_viewer, navel, ocean, outdoors, solo, stomach, string_bikini, thighs, water, wet, collarbone, halterneck, side-tie_bikini_bottom, sitting, tassel, alternate_costume, black_bikini, black_jacket, crossed_bangs, green_capelet, hand_up, long_sleeves, medium_breasts, open_jacket, parted_lips, single_earring, smile, tree, arm_support, black_choker, cameltoe, cleavage, closed_mouth, cowboy_shot, green_bikini, knee_up, large_breasts, off_shoulder, sweat |
| 11 | 7 |  |  |  |  |  | 1girl, :d, ass, beach, cowboy_shot, crossed_bangs, jewelry, looking_at_viewer, looking_back, open_mouth, outdoors, side-tie_bikini_bottom, thighs, water, bare_shoulders, blue_sky, cloud, day, from_behind, halterneck, ocean, sideboob, solo, string_bikini, bare_arms, median_furrow, medium_breasts, green_bikini, large_breasts, standing, tree, wet, blush, teeth |
| 12 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blush, cleavage, cowboy_shot, detached_collar, looking_at_viewer, parted_lips, playboy_bunny, rabbit_ears, solo, strapless_leotard, thighs, alternate_costume, brown_pantyhose, covered_navel, crossed_bangs, fake_animal_ears, highleg_leotard, large_breasts, outdoors, bare_arms, black_leotard, cameltoe, forest, grin, leaf, rabbit_tail, sweat, day, detached_sleeves, hair_flower, hand_up, jewelry, long_sleeves, thigh_gap, thighlet, tree, water, wet, wrist_cuffs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | green_capelet | long_sleeves | looking_at_viewer | solo | crossed_bangs | dress | smile | single_earring | upper_body | blush | bridal_gauntlets | black_dress | brown_thighhighs | detached_sleeves | puffy_long_sleeves | thighlet | simple_background | white_background | open_mouth | holding | jewelry | holding_bow_(weapon) | vision_(genshin_impact) | gloves | black_panties | from_behind | looking_back | thighs | ass_focus | cameltoe | sweat | huge_ass | thong | black_thighhighs | partially_visible_vulva | sideboob | tassel | backless_outfit | 1boy | hetero | nipples | penis | pussy | sex | spread_legs | vaginal | medium_breasts | navel | nude | uncensored | large_breasts | erection | eyewear_on_head | fingerless_gloves | holding_pen | white_jacket | backpack | notepad | open_clothes | outdoors | paper | shirt | black_pants | goggles_on_head | sitting | anus | female_masturbation | mosaic_censoring | small_breasts | alternate_costume | bare_shoulders | sleeveless_dress | halter_dress | cleavage | bare_arms | covered_navel | clothing_cutout | alternate_breast_size | armpit_crease | very_long_hair | building | futanari | large_penis | stomach | testicles | cowboy_shot | female_pubic_hair | veiny_penis | armpits | arms_behind_head | arms_up | black_sleeves | choker | collarbone | completely_nude | huge_breasts | patreon_username | water | wet | beach | blue_sky | cloud | day | ocean | string_bikini | halterneck | side-tie_bikini_bottom | black_bikini | black_jacket | hand_up | open_jacket | parted_lips | tree | arm_support | black_choker | green_bikini | knee_up | off_shoulder | :d | ass | median_furrow | standing | teeth | detached_collar | playboy_bunny | rabbit_ears | strapless_leotard | brown_pantyhose | fake_animal_ears | highleg_leotard | black_leotard | forest | grin | leaf | rabbit_tail | hair_flower | thigh_gap | wrist_cuffs |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:----------------|:---------------|:--------------------|:-------|:----------------|:--------|:--------|:-----------------|:-------------|:--------|:-------------------|:--------------|:-------------------|:-------------------|:---------------------|:-----------|:--------------------|:-------------------|:-------------|:----------|:----------|:-----------------------|:--------------------------|:---------|:----------------|:--------------|:---------------|:---------|:------------|:-----------|:--------|:-----------|:--------|:-------------------|:--------------------------|:-----------|:---------|:------------------|:-------|:---------|:----------|:--------|:--------|:------|:--------------|:----------|:-----------------|:--------|:-------|:-------------|:----------------|:-----------|:------------------|:--------------------|:--------------|:---------------|:-----------|:----------|:---------------|:-----------|:--------|:--------|:--------------|:------------------|:----------|:-------|:----------------------|:-------------------|:----------------|:--------------------|:-----------------|:-------------------|:---------------|:-----------|:------------|:----------------|:------------------|:------------------------|:----------------|:-----------------|:-----------|:-----------|:--------------|:----------|:------------|:--------------|:--------------------|:--------------|:----------|:-------------------|:----------|:----------------|:---------|:-------------|:------------------|:---------------|:-------------------|:--------|:------|:--------|:-----------|:--------|:------|:--------|:----------------|:-------------|:-------------------------|:---------------|:---------------|:----------|:--------------|:--------------|:-------|:--------------|:---------------|:---------------|:----------|:---------------|:-----|:------|:----------------|:-----------|:--------|:------------------|:----------------|:--------------|:--------------------|:------------------|:-------------------|:------------------|:----------------|:---------|:-------|:-------|:--------------|:--------------|:------------|:--------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 29 |  |  |  |  |  | X | X | X | | X | X | X | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | | X | X | X | | | X | X | X | | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 20 |  |  |  |  |  | X | X | X | | X | X | | | | | | | X | X | X | X | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | X | | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | | | X | | | | | X | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | X | | X | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | | X | X | | | | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | X | | X | | X | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 15 |  |  |  |  |  | X | X | | | X | X | X | | X | | | X | | | | | | | | | | | X | | | | | | | X | | | X | | | | | X | | | | | | | | | | | X | | | | X | | | | | | | | | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | | X | X | X | X | | | X | | X | | | | X | | | | | X | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | X | | | X | X | | | | | | | | X | | | | | | | | X | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | X | | | | | | | | | | | | | | | | | | X | | X | X | | | | | | X | | | | | | | | | | X | X | | | X | | | | | | | | | X | | | | | X | | | | | X | | | | X | | | | | | | | | | X | | X | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 11 | 7 |  |  |  |  |  | X | | | | X | X | X | | | | | X | | | | | | | | | X | | X | | | | | X | X | X | | | | | | | | X | | | | | | | | | | | X | | | | X | | | | | | | | | X | | | | | | | | | | | X | | | | X | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | X | | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | |
| 12 | 5 |  |  |  |  |  | X | | | X | X | X | X | | | | | X | | | | X | | X | | | | | X | | | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | X | X | | | X | X | X | | | | | | | | | | X | | | | | | | | | | | | X | X | | | | X | | | | | | | X | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
open-llm-leaderboard/details_TFLai__OpenOrca-Platypus2-13B-QLoRA-0.80-epoch | ---
pretty_name: Evaluation run of TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__OpenOrca-Platypus2-13B-QLoRA-0.80-epoch\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T14:11:37.243975](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__OpenOrca-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-10-19T14-11-37.243975.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.007969798657718121,\n\
\ \"em_stderr\": 0.0009105960734168444,\n \"f1\": 0.09576552013422834,\n\
\ \"f1_stderr\": 0.001953364199146174,\n \"acc\": 0.4345717050239562,\n\
\ \"acc_stderr\": 0.01035518693998461\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.007969798657718121,\n \"em_stderr\": 0.0009105960734168444,\n\
\ \"f1\": 0.09576552013422834,\n \"f1_stderr\": 0.001953364199146174\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11144806671721001,\n \
\ \"acc_stderr\": 0.008668021353794433\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174785\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T14_11_37.243975
path:
- '**/details_harness|drop|3_2023-10-19T14-11-37.243975.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T14-11-37.243975.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T14_11_37.243975
path:
- '**/details_harness|gsm8k|5_2023-10-19T14-11-37.243975.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T14-11-37.243975.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T14_11_37.243975
path:
- '**/details_harness|winogrande|5_2023-10-19T14-11-37.243975.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T14-11-37.243975.parquet'
- config_name: results
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- results_2023-08-28T22:50:32.447793.parquet
- split: 2023_10_19T14_11_37.243975
path:
- results_2023-10-19T14-11-37.243975.parquet
- split: latest
path:
- results_2023-10-19T14-11-37.243975.parquet
---
# Dataset Card for Evaluation run of TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__OpenOrca-Platypus2-13B-QLoRA-0.80-epoch",
"harness_winogrande_5",
split="train")
```
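As the YAML header above shows, each timestamped split name is simply the run timestamp with `-` and `:` replaced by `_` so that it forms a valid split identifier. A minimal sketch of that mapping (an observation about this card's naming convention, not an official `datasets` API):

```python
def run_timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp (as used in the results filenames) to the
    corresponding split name used in this dataset's configurations."""
    # Split identifiers avoid '-' and ':', so both are replaced with '_'.
    return timestamp.replace("-", "_").replace(":", "_")

# e.g. the winogrande run listed in the header above:
print(run_timestamp_to_split_name("2023-10-19T14:11:37.243975"))
# → 2023_10_19T14_11_37.243975
```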
## Latest results
These are the [latest results from run 2023-10-19T14:11:37.243975](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__OpenOrca-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-10-19T14-11-37.243975.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.007969798657718121,
"em_stderr": 0.0009105960734168444,
"f1": 0.09576552013422834,
"f1_stderr": 0.001953364199146174,
"acc": 0.4345717050239562,
"acc_stderr": 0.01035518693998461
},
"harness|drop|3": {
"em": 0.007969798657718121,
"em_stderr": 0.0009105960734168444,
"f1": 0.09576552013422834,
"f1_stderr": 0.001953364199146174
},
"harness|gsm8k|5": {
"acc": 0.11144806671721001,
"acc_stderr": 0.008668021353794433
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174785
}
}
```
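As far as one can verify from the numbers above, the `"all"` block is the unweighted mean of the per-task values for each metric; for example, the `acc` entry (values copied from the JSON):

```python
# Per-task accuracies from the "Latest results" JSON above
gsm8k_acc = 0.11144806671721001
winogrande_acc = 0.7576953433307024

# The "all" accuracy is their unweighted mean
all_acc = (gsm8k_acc + winogrande_acc) / 2
print(all_acc)  # ≈ 0.4345717050239562, matching the "all" value above
```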
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
devChou/sugar_sumnet | ---
license: odbl
---
|
danielmalencar/danielmelo | ---
license: mit
---
|
tyzhu/find_last_sent_train_30_eval_10_hint10 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 91352
num_examples: 70
- name: validation
num_bytes: 11480
num_examples: 10
download_size: 67036
dataset_size: 102832
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_last_sent_train_30_eval_10_hint10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pouya-haghi/celeba-hq-1k | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': female
'1': male
splits:
- name: train
num_bytes: 94417238.56457143
num_examples: 1024
download_size: 94733658
dataset_size: 94417238.56457143
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hiyouga/glaive-function-calling-v2-sharegpt | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- glaiveai
- llama-factory
size_categories:
- 100K<n<1M
pretty_name: Glaive Function Calling
---
The [glaive-function-calling-v2](https://huggingface.co/datasets/glaiveai/glaive-function-calling-v2) dataset in sharegpt format.
You can use it in [LLaMA Factory](https://github.com/hiyouga/LLaMA-Factory) by specifying `--dataset glaive_toolcall_100k`.
|
Saxo/ko_aspect_sentiment_sns_mall_sentiment_linkbricks_single_dataset_with_prompt_text_huggingface | ---
license: apache-2.0
---
|
joey234/mmlu-conceptual_physics-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 67425
num_examples: 235
download_size: 40085
dataset_size: 67425
---
# Dataset Card for "mmlu-conceptual_physics-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adityarra07/ATC_train | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 2954591414.38965
num_examples: 22152
- name: test
num_bytes: 66689044.203450024
num_examples: 500
download_size: 0
dataset_size: 3021280458.5931
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "ATC_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlplabtdtu/daily_dialog_gan | ---
language: en
dataset_info:
features:
- name: text
dtype: string
splits:
- name: step1
num_bytes: 4307176.197157762
num_examples: 6670
- name: step2
num_bytes: 1436155.901421119
num_examples: 2224
- name: step3
num_bytes: 1436155.901421119
num_examples: 2224
- name: val
num_bytes: 663829
num_examples: 1000
- name: test
num_bytes: 645567
num_examples: 1000
download_size: 658018
dataset_size: 8488884.0
---
# Dataset Card for "daily_dialog_gan"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bosbos/instruct_falcon_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 503456
num_examples: 1000
download_size: 241640
dataset_size: 503456
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
RajkNakka/github-issues-comments | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: string
- name: updated_at
dtype: string
- name: closed_at
dtype: string
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 23011032
num_examples: 4900
download_size: 6746060
dataset_size: 23011032
---
# Dataset Card for "github-issues-comments"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hashif/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aahhopapa/SdRunPod_1.5 | ---
license: mit
---
|
aertit/bot_tr_2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: nb_tokens
dtype: int64
splits:
- name: train
num_bytes: 1224611.5903434544
num_examples: 1607
- name: test
num_bytes: 306343.40965654555
num_examples: 402
download_size: 826325
dataset_size: 1530955.0
---
# Dataset Card for "bot_tr_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jjiiaa/mj-prompts | ---
license: unknown
task_categories:
- text-classification
- text-generation
- token-classification
language:
- en
pretty_name: Midjourney prompts
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:** https://huggingface.co/datasets/jjiiaa/mj-prompts/
- **Repository:** https://huggingface.co/datasets/jjiiaa/mj-prompts/
### Dataset Summary
adding soon
## Dataset Structure
adding soon
### Data Splits
adding soon
### Licensing Information
adding soon |
tmnam20/VietnameseBookCorpus-raw | ---
license: mit
---
|
open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_5.0 | ---
pretty_name: Evaluation run of LeroyDyer/Mixtral_AI_Cyber_5.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeroyDyer/Mixtral_AI_Cyber_5.0](https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_5.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_5.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T06:42:38.392993](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_5.0/blob/main/results_2024-04-05T06-42-38.392993.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6319018824466575,\n\
\ \"acc_stderr\": 0.03243701667042675,\n \"acc_norm\": 0.6339164216412606,\n\
\ \"acc_norm_stderr\": 0.03309308870008847,\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.017193835812093893,\n \"mc2\": 0.5786475510628558,\n\
\ \"mc2_stderr\": 0.015303043316159103\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.014224250973257182,\n\
\ \"acc_norm\": 0.6484641638225256,\n \"acc_norm_stderr\": 0.013952413699600935\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6544513045210117,\n\
\ \"acc_stderr\": 0.004745749538752324,\n \"acc_norm\": 0.8445528779127663,\n\
\ \"acc_norm_stderr\": 0.0036158989282692785\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.032555253593403555,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.032555253593403555\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.02552503438247489,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.02552503438247489\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764805,\n \"\
acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764805\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524565,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524565\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\"\
: 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808507,\n \"\
acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808507\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464073,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464073\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677003,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677003\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\
\ \"acc_stderr\": 0.016435865260914742,\n \"acc_norm\": 0.40782122905027934,\n\
\ \"acc_norm_stderr\": 0.016435865260914742\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195448,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195448\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43741851368970014,\n\
\ \"acc_stderr\": 0.012669813464935724,\n \"acc_norm\": 0.43741851368970014,\n\
\ \"acc_norm_stderr\": 0.012669813464935724\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6323529411764706,\n \"acc_stderr\": 0.019506291693954854,\n \
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.019506291693954854\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.017193835812093893,\n \"mc2\": 0.5786475510628558,\n\
\ \"mc2_stderr\": 0.015303043316159103\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.011099796645920522\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5625473843821076,\n \
\ \"acc_stderr\": 0.013664299060751917\n }\n}\n```"
repo_url: https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_5.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|arc:challenge|25_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|gsm8k|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hellaswag|10_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T06-42-38.392993.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T06-42-38.392993.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- '**/details_harness|winogrande|5_2024-04-05T06-42-38.392993.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T06-42-38.392993.parquet'
- config_name: results
data_files:
- split: 2024_04_05T06_42_38.392993
path:
- results_2024-04-05T06-42-38.392993.parquet
- split: latest
path:
- results_2024-04-05T06-42-38.392993.parquet
---
# Dataset Card for Evaluation run of LeroyDyer/Mixtral_AI_Cyber_5.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LeroyDyer/Mixtral_AI_Cyber_5.0](https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_5.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_5.0",
                    "harness_winogrande_5",
                    split="latest")
```
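As noted above, each configuration exposes one timestamped split per run plus a "latest" alias. The following is a minimal, self-contained sketch (the helper name is illustrative, not part of any official API) of how "latest" can be resolved from a list of split names, relying on the fact that the timestamp format used here sorts lexicographically:

```python
def resolve_latest(split_names):
    """Return the most recent timestamped split, ignoring the 'latest' alias."""
    timestamped = [s for s in split_names if s != "latest"]
    # Timestamps like '2024_04_05T06_42_38.392993' sort correctly as strings.
    return max(timestamped)

splits = ["2024_04_05T06_42_38.392993", "latest"]
print(resolve_latest(splits))  # -> 2024_04_05T06_42_38.392993
```

With a single run, as in this dataset, both split names point at the same data; after re-evaluations, the helper would pick the newest timestamp.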
## Latest results
These are the [latest results from run 2024-04-05T06:42:38.392993](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_Cyber_5.0/blob/main/results_2024-04-05T06-42-38.392993.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6319018824466575,
"acc_stderr": 0.03243701667042675,
"acc_norm": 0.6339164216412606,
"acc_norm_stderr": 0.03309308870008847,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093893,
"mc2": 0.5786475510628558,
"mc2_stderr": 0.015303043316159103
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.014224250973257182,
"acc_norm": 0.6484641638225256,
"acc_norm_stderr": 0.013952413699600935
},
"harness|hellaswag|10": {
"acc": 0.6544513045210117,
"acc_stderr": 0.004745749538752324,
"acc_norm": 0.8445528779127663,
"acc_norm_stderr": 0.0036158989282692785
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.02552503438247489,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.02552503438247489
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764805,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764805
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524565,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808507,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808507
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464073,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464073
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914742,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914742
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195448,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195448
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43741851368970014,
"acc_stderr": 0.012669813464935724,
"acc_norm": 0.43741851368970014,
"acc_norm_stderr": 0.012669813464935724
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.019506291693954854,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.019506291693954854
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093893,
"mc2": 0.5786475510628558,
"mc2_stderr": 0.015303043316159103
},
"harness|winogrande|5": {
"acc": 0.8066298342541437,
"acc_stderr": 0.011099796645920522
},
"harness|gsm8k|5": {
"acc": 0.5625473843821076,
"acc_stderr": 0.013664299060751917
}
}
```
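The per-task entries above can be aggregated directly once loaded. Here is a minimal sketch (the dict is a truncated, illustrative subset of the JSON above) of computing the macro-average accuracy over the hendrycksTest (MMLU) tasks:

```python
# Illustrative subset of the results dict shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|winogrande|5": {"acc": 0.8066298342541437},
}

# Keep only MMLU tasks, then average their accuracies.
mmlu = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu) / len(mmlu)
print(mmlu_avg)
```

Note that the "all" block in the full results is the aggregate computed by the leaderboard itself; this snippet only shows the shape of such a computation.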
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
vietgpt-archive/camel_vi | ---
dataset_info:
features:
- name: role_1
dtype: string
- name: role_2
dtype: string
- name: original_task
dtype: string
- name: specified_task
dtype: string
- name: messages
list:
- name: input
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: role
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 171026076
num_examples: 10744
download_size: 52918251
dataset_size: 171026076
---
# Dataset Card for "camel_vi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roleplay4fun/aesir-v1.0 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: bot_name
dtype: string
- name: persona
dtype: string
- name: tags
sequence: string
- name: scenario
dtype: string
- name: demonstration
dtype: string
- name: first_message
dtype: string
splits:
- name: train
num_bytes: 10337644
num_examples: 1000
download_size: 5988374
dataset_size: 10337644
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AshanGimhana/Mydata | ---
license: mit
---
|
JINIAC/ja_wiki_20240301_filter | ---
license: cc-by-sa-4.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 549213864
num_examples: 206950
download_size: 336842660
dataset_size: 549213864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Samibaral/MathNet_28K | ---
license: other
license_name: private-share
license_link: https://new.assistments.org/terms-and-conditions-assistments
---
|
phantuecs/testtesttesttestest | ---
license: mit
language:
- vi
- en
task_categories:
- audio-to-audio
tags:
- art
pretty_name: fds
size_categories:
- 1M<n<10M
--- |
trajesh/english_tanglish_rajini_dialogues | ---
dataset_info:
config_name: en-tg
features:
- name: id
dtype: int64
- name: translation
dtype: string
splits:
- name: train
num_bytes: 36727
num_examples: 247
download_size: 5018
dataset_size: 36727
configs:
- config_name: en-tg
data_files:
- split: train
path: en-tg/train-*
---
|
greathero/testdataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': one
'1': two
splits:
- name: train
num_bytes: 1279.0
num_examples: 1
download_size: 10295
dataset_size: 1279.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dhiruHF/DocQA-dataset-300-samples | ---
dataset_info:
features:
- name: input
dtype: string
splits:
- name: train
num_bytes: 1275055
num_examples: 300
download_size: 752355
dataset_size: 1275055
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "DocQA-dataset-300-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tellarin-ai/ntx_llm_inst_italian | ---
license: cc-by-sa-4.0
language:
- it
task_categories:
- token-classification
---
# Dataset Card for NTX v1 in the Aya format - Italian subset
This dataset is a format conversion of the Italian data from the original NTX dataset into the Aya instruction format, released here under the CC-BY-SA 4.0 license.
## Dataset Details
For the original NTX dataset, the conversion to the Aya instructions format, or more details, please refer to the full dataset in instruction form (https://huggingface.co/datasets/tellarin-ai/ntx_llm_instructions) or to the paper below.
**NOTE:** Unfortunately, due to a conversion issue with numerical expressions, this version only includes the temporal expressions part of NTX.
## Citation
If you utilize this dataset version, feel free to cite/footnote the complete version at https://huggingface.co/datasets/tellarin-ai/ntx_llm_instructions, but please also cite the *original dataset publication*.
**BibTeX:**
```
@misc{chen2023dataset,
title={Dataset and Baseline System for Multi-lingual Extraction and Normalization of Temporal and Numerical Expressions},
author={Sanxing Chen and Yongqiang Chen and Börje F. Karlsson},
year={2023},
eprint={2303.18103},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
ares1123/vto_dress_train_data | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 3926205095.0
num_examples: 11647
download_size: 3925251404
dataset_size: 3926205095.0
---
# Dataset Card for "vto_dress_train_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_46 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 22148956944.625
num_examples: 230603
download_size: 19719920430
dataset_size: 22148956944.625
---
# Dataset Card for "chunk_46"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mychen76/color_terms_llama2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 19196851.91855284
num_examples: 27109
- name: test
num_bytes: 4799744.081447163
num_examples: 6778
- name: validation
num_bytes: 960232.070587541
num_examples: 1356
download_size: 4680227
dataset_size: 24956828.070587542
---
# Dataset Card for "color_terms_llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ghidav/safety-data | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1461769623
num_examples: 458194
- name: test
num_bytes: 129475839
num_examples: 41596
download_size: 312584599
dataset_size: 1591245462
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "safety-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AnonymousSubmissionOnly/Hybrid | ---
license: mit
---
|
jose-h-solorzano/synth-forgetting-generalization-8 | ---
dataset_info:
features:
- name: input
sequence: float64
- name: output
sequence: float64
splits:
- name: train
num_bytes: 16320000.0
num_examples: 40000
- name: test
num_bytes: 4080000.0
num_examples: 10000
download_size: 13810203
dataset_size: 20400000.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AlekseyKorshuk/roleplay-characters | ---
dataset_info:
features:
- name: char_name
dtype: string
- name: char_persona
dtype: string
- name: world_scenario
dtype: string
- name: char_greeting
dtype: string
- name: example_dialogue
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: personality
dtype: string
- name: scenario
dtype: string
- name: first_mes
dtype: string
- name: mes_example
dtype: string
- name: metadata
struct:
- name: created
dtype: int64
- name: modified
dtype: int64
- name: source
dtype: 'null'
- name: tool
struct:
- name: name
dtype: string
- name: url
dtype: string
- name: version
dtype: string
- name: version
dtype: int64
- name: image
dtype: image
splits:
- name: train
num_bytes: 474656700.0
num_examples: 784
download_size: 0
dataset_size: 474656700.0
---
# Dataset Card for "roleplay-characters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/avatar-extra-lite_captioned-augmented | ---
dataset_info:
features:
- name: image
dtype: image
- name: src
dtype: string
- name: split
dtype: string
- name: id
dtype: int64
- name: caption
dtype: string
splits:
- name: train
num_bytes: 32361041.0
num_examples: 135
download_size: 32349531
dataset_size: 32361041.0
---
# Dataset Card for "avatar-extra-lite_captioned-augmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vinci-grape/repair | ---
license: other
---
|
imageomics/TreeOfLife-10M | ---
license: cc0-1.0
language:
- en
- la
pretty_name: TreeOfLife-10M
task_categories:
- image-classification
- zero-shot-classification
tags:
- biology
- images
- animals
- evolutionary biology
- CV
- multimodal
- clip
- species
- taxonomy
- knowledge-guided
- imbalanced
size_categories:
- 10M<n<100M
---
# Dataset Card for TreeOfLife-10M
## Dataset Description
<!-- - **Homepage:** -->
- **Repository:** [Imageomics/bioclip](https://github.com/Imageomics/bioclip)
- **Paper:** BioCLIP: A Vision Foundation Model for the Tree of Life ([arXiv](https://doi.org/10.48550/arXiv.2311.18803))
<!-- - **Leaderboard:** -->
### Dataset Summary
With over 10 million images covering 454 thousand taxa in the tree of life, TreeOfLife-10M is the largest-to-date ML-ready dataset of images of biological organisms paired with their associated taxonomic labels. It expands on the foundation established by existing high-quality datasets, such as iNat21 and BIOSCAN-1M, by further incorporating newly curated images from the Encyclopedia of Life (eol.org), which supplies most of TreeOfLife-10M’s data diversity. Every image in TreeOfLife-10M is labeled to the most specific taxonomic level possible, as well as higher taxonomic ranks in the tree of life (see [Text Types](#text-types) for examples of taxonomic ranks and labels). TreeOfLife-10M was generated for the purpose of training [BioCLIP](https://huggingface.co/imageomics/bioclip) and future biology foundation models.
<!--This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). And further altered to suit Imageomics Institute needs. -->
**Figure 1.** Treemap from phyla down to family for TreeOfLife-10M. Interactive version available in the [`visuals`](https://huggingface.co/datasets/imageomics/TreeOfLife-10M/tree/main/visuals) folder.
### Supported Tasks and Leaderboards
Image Classification, Zero-shot and few-shot Classification.
### Languages
English, Latin
## Dataset Contents
```
/dataset/
EOL/
image_set_01.tar.gz
image_set_02.tar.gz
...
image_set_63.tar.gz
metadata/
catalog.csv
species_level_taxonomy_chains.csv
taxon.tab
licenses.csv
visuals/
    kingdom_ToL_tree.html
kingdom_ToL_tree.pdf
phyla_ToL_tree.html
phyla_ToL_tree.pdf
phyla_ToL_tree.png
```
Each `image_set` is approximately 30GB and contains 100 thousand images, each named `<treeoflife_id>.jpg`.
We cannot reproduce the `iNat21` data, but after downloading it and BIOSCAN-1M, one can follow the directions from step 6 of [docs/imageomics/treeoflife10m.md](https://github.com/Imageomics/bioclip/blob/main/docs/imageomics/treeoflife10m.md) in the BioCLIP GitHub repo to combine them with the EOL data into the proper webdataset structure. This will produce a collection of files named `shard-######.tar` in a `train`, `val`, and `train_small` folder with which to work.
Inside each shard is a collection of images (named `<treeoflife_id>.jpg`), for which each has the following files:
```
<treeoflife_id>.com.txt
<treeoflife_id>.common_name.txt
<treeoflife_id>.jpg
<treeoflife_id>.sci.txt
<treeoflife_id>.sci_com.txt
<treeoflife_id>.scientific_name.txt
<treeoflife_id>.taxon.txt
<treeoflife_id>.taxonTag.txt
<treeoflife_id>.taxonTag_com.txt
<treeoflife_id>.taxon_com.txt
<treeoflife_id>.taxonomic_name.txt
```
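Once the `shard-######.tar` files are produced, one minimal way to iterate samples is to group each shard's member files by their shared `<treeoflife_id>` prefix. The sketch below uses only the Python standard library and assumes the per-sample layout shown above; the helper name `iter_shard` is ours, not part of the BioCLIP tooling:

```python
import tarfile
from collections import defaultdict

def iter_shard(path):
    """Yield (treeoflife_id, {suffix: bytes}) for each sample in one
    webdataset shard, grouping tar members by their shared id prefix."""
    samples = defaultdict(dict)
    with tarfile.open(path) as tar:
        for member in tar.getmembers():
            if not member.isfile():
                continue
            # e.g. "1234.sci_com.txt" -> id "1234", suffix "sci_com.txt"
            sample_id, _, suffix = member.name.partition(".")
            samples[sample_id][suffix] = tar.extractfile(member).read()
    yield from sorted(samples.items())
```

For large-scale training, a purpose-built loader such as the `webdataset` library is the more common choice; the sketch is only meant to show the file-grouping convention.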
### Data Instances
This dataset is a collection of images with associated text. The text matched to images contains both [Linnaean taxonomy](https://www.britannica.com/science/taxonomy/The-objectives-of-biological-classification) (kingdom through species) for the particular subject of the image and its common (or vernacular) name where available. There are 8,455,243 images with full taxonomic labels.
### Data Fields
#### Metadata Files
`catalog.csv`: contains the following metadata associated with each image in the dataset
- `split`: indicates which data split the image belongs to (`train`, `val`, or `train_small`), `train_small` is a duplicated subset of `train` and thus should not be included when analyzing overall stats of the dataset.
- `treeoflife_id`: unique identifier for the image in the dataset.
- `eol_content_id`: unique identifier within EOL database for images sourced from [EOL](https://eol.org). Note that EOL content IDs are not stable.
- `eol_page_id`: identifier of page from which images from EOL are sourced. Note that an image's association to a particular page ID may change with updates to the EOL (or image provider's) hierarchy. However, EOL taxon page IDs are stable.
- `bioscan_part`: indicates to which of the 113 data chunks of [BIOSCAN-1M](https://github.com/zahrag/BIOSCAN-1M#-iv-rgb-images) each image belongs. Note that there are 10K images per chunk and 8,313 in chunk #113.
- `bioscan_filename`: unique identifier within BIOSCAN-1M dataset for images sourced from [BIOSCAN-1M](https://github.com/zahrag/BIOSCAN-1M).
- `inat21_filename`: unique identifier within iNat21 dataset for images sourced from [iNat21](https://github.com/visipedia/inat_comp/blob/master/2021/README.md).
<!-- (`file_name` given in `images` of the [`train.json`](https://github.com/visipedia/inat_comp/tree/master/2021#annotation-format) `file_name` = "train/#####_Kingdom_Phylum_..._Genus_species/STRING(uuid?).jpg"). `inat21_filename` is the end of the `file_name` string. The taxa are the `cls_name`, and the number is the `cls_num` (leading 0 may be lost here).-->
- `inat21_cls_name`: `<Kingdom>_<Phylum>_<Class>_<Order>_<Family>_<Genus>_<species>` as labeled by iNaturalist.
- `inat21_cls_num`: Number assigned by iNat21 to the given species (unique identifier for that species within iNat21 dataset).
The remaining terms describe the _Linnaean taxonomy_ of the subject of the image; they are sourced as described in [Annotation Process, below](#annotation-process).
- `kingdom`: kingdom to which the subject of the image belongs (`Animalia`, `Plantae`, `Fungi`, `Chromista`, `Protozoa`, `Bacteria`, `Viridiplantae`, `Protista`, `Orthornavirae`, `Bamfordvirae`, `Archaea`, or `Shotokuvirae`). Note: this large number of kingdoms is considered in recognition of the fact that there is no agreement on merging them.
- `phylum`: phylum to which the subject of the image belongs.
- `class`: class to which the subject of the image belongs.
- `order`: order to which the subject of the image belongs.
- `family`: family to which the subject of the image belongs.
- `genus`: genus to which the subject of the image belongs.
- `species`: species to which the subject of the image belongs.
- `common`: common name associated with the subject of the image where available. Otherwise, this is the scientific name (`genus-species`), else whatever subset of the taxonomic hierarchy is available (eg., `kingdom-phylum-class-order` or `kingdom-phylum-class-order-family`). All images have a non-null entry for this column.
Note that the `species` column occasionally has entries such as "sp. ___(get ex)" with some string following. This seems to be used to indicate the species is unknown, but various specimens/images are known to be the same species. Additionally, for `species` values containing an `x` between names, this is indicative of a hybrid that is a cross of the two species listed on either side of the `x`.
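As a concrete example of working with `catalog.csv`, the sketch below selects rows with a complete 7-rank Linnaean label while skipping the duplicated `train_small` split, as advised above. It uses only the standard library; the helper name is ours, and the column names are those listed above:

```python
import csv

LINNAEAN_RANKS = ["kingdom", "phylum", "class", "order", "family", "genus", "species"]

def full_taxa_rows(catalog_path):
    """Yield catalog.csv rows with all seven Linnaean ranks filled in,
    excluding the train_small split (a duplicated subset of train)."""
    with open(catalog_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["split"] == "train_small":
                continue
            if all(row[rank] for rank in LINNAEAN_RANKS):
                yield row
```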
##### Text Types
| Text Type | Example |
| ---- | -------- |
| Common | black-billed magpie |
| Scientific | _Pica hudsonia_ |
| Taxonomic | _Animalia Chordata Aves Passeriformes Corvidae Pica hudsonia_ |
`species_level_taxonomy_chains.csv`: CSV with the ITIS taxonomic hierarchy, indicated as follows:
- `hierarchy_string_tsn`: string of Taxonomic Serial Numbers (TSN)* for the names of the ranks provided from highest to lowest, connected by dashes (eg., `202422-846491-660046-846497-846508-846553-954935-5549-5550`).
- `hierarchy_string_names`: string of the names of the ranks provided from highest to lowest, connected by arrows (eg., `Plantae->Biliphyta->Rhodophyta->Cyanidiophytina->Cyanidiophyceae->Cyanidiales->Cyanidiaceae->Cyanidium->Cyanidium caldarium`).
- `terminal_tsn`: Taxonomic Serial Number (TSN)* of designated species (eg., `5550`).
- `terminal_scientific_name`: scientific name (`<Genus> <species>`) of subject.
- `terminal_vernacular`: vernacular or common name(s) of the subject, multiple names are separated by commas (eg., `rockskipper`, `Highland Small Rice Rat, Páramo Colilargo`).
- `terminal_vernacular_lang`: language(s) of the vernacular name(s) provided; when there are multiple names, language is listed for each, separated by commas (eg., `English`, `English, English`, respectively for the vernacular name examples above).
- `hierarchy_string_ranks`: string of ranks provided from highest to lowest, connected by arrows (eg., `Kingdom->Subkingdom->Phylum->Subphylum->Class->Order->Family->Genus->Species`).
The remaining columns consist of the hierarchy string ranks describing the Linnaean taxonomy of the subject (as defined above), with `<Genus> <species>` filled in the `Species` column.
*ITIS assigns a Taxonomic Serial Number (TSN) to each taxonomic rank; this is a stable and unique ID.
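The arrow-delimited hierarchy strings pair up rank-for-rank, so one row can be turned into a rank-to-name mapping with a small helper (the helper name is ours; the example values are taken from the row shown above):

```python
def parse_hierarchy(hierarchy_string_ranks, hierarchy_string_names):
    """Zip the '->'-delimited rank and name strings into a dict."""
    return dict(zip(hierarchy_string_ranks.split("->"),
                    hierarchy_string_names.split("->")))

ranks = "Kingdom->Subkingdom->Phylum->Subphylum->Class->Order->Family->Genus->Species"
names = ("Plantae->Biliphyta->Rhodophyta->Cyanidiophytina->Cyanidiophyceae"
         "->Cyanidiales->Cyanidiaceae->Cyanidium->Cyanidium caldarium")
taxa = parse_hierarchy(ranks, names)
print(taxa["Species"])  # Cyanidium caldarium
```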
`taxon.tab`: Tab-delimited file with taxonomic information for EOL images based on EOL page IDs.
- `taxonID`: unique identifier for the file.
- `source`: often `<source>:<id>` where the source corresponds to the domain of the `furtherInformationURL`. The ID likely corresponds to an ID at the source.
- `furtherInformationURL`: URL with more information on the indicated taxon.
- `acceptedNameUsageID`: `taxonID` for the name accepted to represent this entry. Less than a third of these are non-null.
- `parentNameUsageID`: `taxonID` of taxonomic rank above the indicated `taxonRank` in the hierarchy (eg., the `taxonID` of the genus `Atadinus` for the `Atadinus fallax (Boiss.) Hauenschild` entry).
- `scientificName`: scientific name associated with the EOL page (`<canonicalName> <authority>`, authority as available).
- `taxonRank`: lowest rank of the taxonomic tree indicated (eg., `genus` or `species`), occasionally not indicated, even for accepted names.
- `taxonomicStatus`: whether the name is accepted by EOL or not (`accepted` or `not accepted`, corresponding to the existence of a non-null `eolID` or `acceptedNameUsageID` entry, respectively).
- `datasetID`: generally corresponds to the source identified in `source` column.
- `canonicalName`: the name(s) associated with the `taxonRank` (eg., `<Genus> <species>` for species).
- `authority`: usually name of person who assigned the name, with the year as available.
- `eolID`: the EOL page ID (only non-null when `taxonomicStatus` is accepted by EOL).
- `Landmark`: numeric values, meaning unknown, mostly null.
- `higherClassification`: labeling in the EOL Dynamic Hierarchy above the `taxonRank` (eg., `Life|Cellular Organisms|Eukaryota|Opisthokonta|Metazoa|Bilateria|Protostomia|Ecdysozoa|Arthropoda|Pancrustacea|Hexapoda|Insecta|Pterygota|Neoptera|Endopterygota|Coleoptera|Adephaga|Carabidae|Paussus`).
`licenses.csv`: File with license, source, and copyright holder associated to each image from EOL listed in `catalog.csv`; `treeoflife_id` is the shared unique identifier to link the two files. Columns are
- `treeoflife_id`, `eol_content_id`, and `eol_page_id` are as defined above.
- `md5`: MD5 hash of the image.
- `medium_source_url`: URL pointing to source of image.
- `eol_full_size_copy_url`: URL to access the full-sized image; this is the URL from which the image was downloaded for this dataset (see [Initial Data Collection and Normalization](#initial-data-collection-and-normalization) for more information on this process).
- `license_name`: name of license attached to the image (eg., `cc-by`).
- `copyright_owner`: copyright holder for the image, filled with `not provided` if no copyright owner was provided.
- `license_link`: URL to the listed license, left null in the case that `License Name` is `No known copyright restrictions`.
- `title`: title provided for the image, filled with `not provided` if no title was provided.
### Data Splits
As noted above, the `split` column of `catalog.csv` indicates to which split each image belongs. Note that `train_small` is a 1M-image, uniformly sampled, subset of `train` used for fine-tuned ablation training and all entries with this label are also listed with the `train` label. The `val` label is applied to images used for validation.
10 biologically-relevant datasets were used for various tests of [BioCLIP](https://huggingface.co/imageomics/bioclip) (which was trained on this dataset); they are described (briefly) and linked to below.
#### Test Sets
- [Meta-Album](https://paperswithcode.com/dataset/meta-album): Specifically, we used the Plankton, Insects, Insects 2, PlantNet, Fungi, PlantVillage, Medicinal Leaf, and PlantDoc datasets from Set-0 through Set-2 (Set-3 had not yet been released as of our publication/evaluation, Nov. 2023).
- [Birds 525](https://www.kaggle.com/datasets/gpiosenka/100-bird-species): We evaluated on the 2,625 test images provided with the dataset.
- [Rare Species](https://huggingface.co/datasets/imageomics/rare-species): A new dataset we curated for the purpose of testing this model and to contribute to the ML for Conservation community. It consists of 400 species labeled Near Threatened through Extinct in the Wild by the [IUCN Red List](https://www.iucnredlist.org/), with 30 images per species. For more information, see our dataset, [Rare Species](https://huggingface.co/datasets/imageomics/rare-species).
For more information about the contents of these datasets, see Table 2 and associated sections of [our paper](https://doi.org/10.48550/arXiv.2311.18803).
## Dataset Creation
### Curation Rationale
Previously, the largest ML-ready biology image dataset was [iNat21](https://github.com/visipedia/inat_comp/tree/master/2021), which consists of 2.7M images of 10K species. This is significant breadth when comparing to popular general-domain datasets, such as [ImageNet-1K](https://huggingface.co/datasets/imagenet-1k); 10K species are rather limited when considering the vast scope of biology. For context, in 2022, [The International Union for Conservation of Nature (IUCN)](https://www.iucnredlist.org/) reported over 2M total described species, with over 10K distinct species of birds and reptiles alone. Thus, the lesser species diversity of iNat21 limits its potential for pre-training a foundation model for the entire tree of life.
With this focus on species diversity and the need for high-quality images of biological organisms, we looked to the [Encyclopedia of Life Project (EOL)](https://eol.org/). EOL is an image aggregator that collaborates with a variety of institutions to source and label millions of images. After downloading 6.6M images from EOL, we were able to expand our dataset to cover an additional 440K taxa.
Insects (of the class Insecta with 1M+ species), birds (of the class Aves with 10K+ species), and reptiles (of the class Reptilia with 10K+ species) are examples of highly diverse subtrees with many more species than other taxonomic classes. This imbalance among subtrees in the tree of life presents challenges in training a foundation model that can recognize extremely fine-grained visual representations of these classes. To help address this challenge for insects, we incorporated [BIOSCAN-1M](https://zenodo.org/doi/10.5281/zenodo.8030064), a recent dataset of 1M expert-labeled lab images of insects, covering 494 different families. The added variety of lab images, rather than in situ images (as in iNat21), further diversifies the _image_ distribution of TreeOfLife-10M.
Overall, this dataset contains approximately 454K unique taxonomic labels of the more than 2M recorded by [IUCN](https://www.iucnredlist.org/) in 2022. To the best of our knowledge, this is still the most diverse and largest such ML-ready dataset available, hence our curation.
### Source Data
[iNat21 data](https://github.com/visipedia/inat_comp/tree/master/2021#data) was downloaded, unzipped, and our compilation scripts pointed to the training split. As per their [terms of use](https://github.com/visipedia/inat_comp/tree/master/2021#terms-of-use), the data is catalogued, but not reproduced, here.
[BIOSCAN-1M](https://zenodo.org/doi/10.5281/zenodo.8030064): Collection of insect images hand-labeled by experts.
[EOL](https://eol.org/): Biological image aggregator.
#### Initial Data Collection and Normalization
[iNat21 training data](https://github.com/visipedia/inat_comp/tree/master/2021#data) and [BIOSCAN-1M data](https://zenodo.org/doi/10.5281/zenodo.8030064) were downloaded and assigned `treeoflife_id`s for unique identification within the TreeOfLife-10M dataset. The iNat21 training data is formatted into a webdataset format prior to `treeoflife_id` assignments, since this is also used for a comparison to [BioCLIP](https://huggingface.co/imageomics/bioclip) as trained on the full TreeOfLife-10M dataset. For more detailed information on this process, please see [How to Create TreeOfLife-10M](https://github.com/Imageomics/bioclip/tree/main/docs/imageomics/treeoflife10m.md#how-to-create-treeoflife-10m) in the BioCLIP GitHub repo.
First, media manifest data was sourced from EOL using [this script](https://github.com/Imageomics/bioclip/blob/main/scripts/get_media_manifest.py). The media manifest includes EOL content and page IDs from which to connect the taxonomic information, along with source URLs and licensing information. The `EOL Full-Size Copy URL` was then used to download all the images, naming each `<eol_content_id>_<eol_page_id>_eol_full-size-copy.jpg` for reference back to the media manifest. [Scripts](https://github.com/Imageomics/bioclip/tree/main/scripts/evobio10m) to perform these downloads and [instructions](https://github.com/Imageomics/bioclip/blob/main/docs/imageomics/treeoflife10m.md) can be found in the [BioCLIP GitHub repository](https://github.com/Imageomics/bioclip).
See [below](#Annotation-Process) for details of annotation following data collection.
Species selected for the Rare Species dataset were removed from this dataset (see [Initial Data Collection and Normalization of Rare Species](https://huggingface.co/datasets/imageomics/rare-species#initial-data-collection-and-normalization)).
### Annotations
#### Annotation Process
Annotations were primarily sourced from image source providers.
For iNat21 and BIOSCAN-1M images, the labels provided by those sources were used.
- iNat21: iNaturalist English vernacular names and taxa were used.
- BIOSCAN-1M: Linnaean taxonomic rankings were applied as labeled in the [BIOSCAN-1M dataset](https://zenodo.org/doi/10.5281/zenodo.8030064), which is all hand-labeled by experts. Note that the dataset provides other ranks (not considered in the 7-rank Linnaean taxonomy), such as tribe, which were not included in this dataset.
For images from EOL, the scientific name (`genus-species`) was used to look up the higher-order taxa from the following sources, in order: BIOSCAN-1M metadata, then EOL aggregate datasets (described below); the results were then matched against the ITIS hierarchy to standardize the higher-order taxa. A small number of these are [homonyms](https://en.wikipedia.org/wiki/Homonym_(biology)), for which a list was generated to ensure proper matching of higher-order taxa (manual homonym resolution is in class `NameUpgrader` in the [naming script](https://github.com/Imageomics/bioclip/blob/main/src/imageomics/naming.py)). After these resources were exhausted, any remaining unresolved taxa were fed through the [Global Names Resolver (GNR) API](https://resolver.globalnames.org/api). Despite our efforts, we discovered after training that some hemihomonyms were mislabeled at higher-level taxa (family up to kingdom). This impacts approximately 0.1-0.2% of our data. We are in the process of developing a more robust solution to taxonomic labeling which will also account for re-naming (as is currently in process for many bird species). We intend to release a patch alongside the solution.
This process allowed us to reach full taxa labels for 84% of images. To put this in perspective, 10% of images in TreeOfLife-10M are only labeled to the `family` level (no `genus-species` designations) as part of BIOSCAN-1M, so this places a cap on the taxa coverage. Taxonomic ranking also is not entirely standardized and agreed-upon throughout the biology community, so most gaps are more indicative of lack of consensus on label than missing information.
#### Who are the annotators?
Samuel Stevens, Jiaman Wu, Matthew J. Thompson, and Elizabeth G. Campolongo
### Personal and Sensitive Information
N/A
## Considerations for Using the Data
### Social Impact of Dataset
The hope is that this dataset could be helpful in conservation efforts or biodiversity research.
### Discussion of Biases and Other Known Limitations
This dataset is imbalanced in its representation of various species, with the greatest representation available for those in the phyla _Arthropoda_, _Tracheophyta_, and _Chordata_ (see our [interactive treemap from phylum to family](https://huggingface.co/imageomics/treeoflife-10m/raw/main/phyla_ToL_tree.html) for further details of this distribution). This class imbalance is both a result of availability of images and actual variance in class diversity. Additionally, as noted above, there are 2M+ estimated species according to [IUCN](https://www.iucnredlist.org/), so overall taxonomic coverage is still limited (though it far surpasses the species diversity of other well-known animal datasets).
Not all data is labeled to the species level, and some entries are more or less precise. For instance, the `species` column occasionally has entries such as "sp. ___(get ex)" with some string following. This seems to be used to indicate the species is unknown, but various specimens/images are known to be the same species. Additionally, for `species` values containing an `x` between names, this is indicative of a hybrid that is a cross of the two species listed on either side of the `x`. Due to the additional information provided about the higher order taxa, these labeling anomalies still present valuable information providing links between these classes.
As stated above, 84% of images have full taxa labels. However, due to the incomplete standardization and agreement on the taxonomic hierarchy throughout the biology community, most gaps are more indicative of lack of consensus on label than missing information.
Note that BIOSCAN-1M’s label granularity may still be limited for insects, as 98.6% of BIOSCAN-1M’s images are labeled to the family level but only 22.5% and 7.5% of the images have genus or species indicated, respectively. Lack of label granularity is an inherent challenge.
## Additional Information
### Dataset Curators
Samuel Stevens, Jiaman Wu, Matthew J. Thompson, and Elizabeth G. Campolongo
### Licensing Information
The data (images and text) contain a variety of licensing restrictions, mostly within the CC family. Each image and text in this dataset is provided under the least restrictive terms allowed by its licensing requirements as provided to us (i.e., we impose no additional restrictions past those specified by licenses in the license file).
Please see the [iNat21 terms of use](https://github.com/visipedia/inat_comp/tree/master/2021#terms-of-use) for full information on use of their images.
All BIOSCAN-1M images are licensed under [CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/).
EOL images contain a variety of licenses ranging from [CC0](https://creativecommons.org/publicdomain/zero/1.0/) to [CC BY-NC-SA](https://creativecommons.org/licenses/by-nc-sa/4.0/).
For license and citation information by image, see our [license file](https://huggingface.co/datasets/imageomics/treeoflife-10m/blob/main/metadata/licenses.csv).
**Note**: Due to licensing restrictions discovered after training, approximately 30K of the images used to train BioCLIP (about 0.3%) cannot be republished here and links to original content are no longer available. Overall, 14 families that were included in training BioCLIP are not republished in this dataset, a loss of 0.38% of the taxa diversity.
This dataset (the compilation) has been marked as dedicated to the public domain by applying the [CC0 Public Domain Waiver](https://creativecommons.org/publicdomain/zero/1.0/). However, images may be licensed under different terms (as noted above).
### Citation Information
```
@dataset{treeoflife_10m,
author = {Samuel Stevens and Jiaman Wu and Matthew J Thompson and Elizabeth G Campolongo and Chan Hee Song and David Edward Carlyn and Li Dong and Wasila M Dahdul and Charles Stewart and Tanya Berger-Wolf and Wei-Lun Chao and Yu Su},
title = {TreeOfLife-10M},
year = {2023},
url = {https://huggingface.co/datasets/imageomics/TreeOfLife-10M},
doi = {10.57967/hf/1972},
publisher = {Hugging Face}
}
```
Please also cite our paper:
```
@article{stevens2023bioclip,
title = {BIOCLIP: A Vision Foundation Model for the Tree of Life},
author = {Samuel Stevens and Jiaman Wu and Matthew J Thompson and Elizabeth G Campolongo and Chan Hee Song and David Edward Carlyn and Li Dong and Wasila M Dahdul and Charles Stewart and Tanya Berger-Wolf and Wei-Lun Chao and Yu Su},
year = {2023},
eprint = {2311.18803},
archivePrefix = {arXiv},
primaryClass = {cs.CV}}
```
Please be sure to also cite the original data sources and all constituent parts as appropriate.
- iNat21:
```
@misc{inat2021,
author={Van Horn, Grant and Mac Aodha, Oisin},
title={iNat Challenge 2021 - FGVC8},
publisher={Kaggle},
year={2021},
url={https://kaggle.com/competitions/inaturalist-2021}
}
```
- BIOSCAN-1M:
```
@inproceedings{gharaee2023step,
title={A Step Towards Worldwide Biodiversity Assessment: The {BIOSCAN-1M} Insect Dataset},
booktitle = {Advances in Neural Information Processing Systems ({NeurIPS}) Datasets \& Benchmarks Track},
author={Gharaee, Z. and Gong, Z. and Pellegrino, N. and Zarubiieva, I. and Haurum, J. B. and Lowe, S. C. and McKeown, J. T. A. and Ho, C. Y. and McLeod, J. and Wei, Y. C. and Agda, J. and Ratnasingham, S. and Steinke, D. and Chang, A. X. and Taylor, G. W. and Fieguth, P.},
year={2023},
}
```
- EOL: Encyclopedia of Life. Available from http://eol.org. Accessed 29 July 2023.
For license and citation information by image, see our [license file](https://huggingface.co/datasets/imageomics/treeoflife-10m/blob/main/metadata/licenses.csv).
- ITIS: Retrieved July 20, 2023, from the Integrated Taxonomic Information System (ITIS) on-line database, www.itis.gov, CC0
https://doi.org/10.5066/F7KH0KBK
### Contributions
The [Imageomics Institute](https://imageomics.org) is funded by the US National Science Foundation's Harnessing the Data Revolution (HDR) program under [Award #2118240](https://www.nsf.gov/awardsearch/showAward?AWD_ID=2118240) (Imageomics: A New Frontier of Biological Information Powered by Knowledge-Guided Machine Learning). Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
|
mask-distilled-one-sec-cv12/chunk_34 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 645792900
num_examples: 126825
download_size: 659289645
dataset_size: 645792900
---
# Dataset Card for "chunk_34"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jemfu/mteb_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_bytes: 34
num_examples: 2
- name: queries
num_bytes: 36
num_examples: 2
download_size: 2298
dataset_size: 70
configs:
- config_name: default
data_files:
- split: corpus
path: data/corpus-*
- split: queries
path: data/queries-*
---
|
tollefj/norwegian-nli-triplets | ---
dataset_info:
features:
- name: anchor
dtype: string
- name: entailment
dtype: string
- name: contradiction
dtype: string
splits:
- name: train
num_bytes: 88455406
num_examples: 551015
download_size: 39831572
dataset_size: 88455406
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "norwegian-nli-triplets"
A reformatting of compatible triplets from https://huggingface.co/datasets/tollefj/all-nli-NOB.
This includes all pairs that contain both a contradiction and an entailment.
Where a neutral sentence also exists, the triplet is duplicated with the same contradiction/entailment pair.
Simple normalization of sentences:
```python
from neattext.functions import clean_text
import re
def symbol_cleaner(s):
s = re.sub(r"^[^\w\d]+", "", s)
s = re.sub(r"[^\w\d]+$", "", s)
return s
def filter_sent(sent):
sent = clean_text(sent, puncts=False, stopwords=False, non_ascii=False)
sent = symbol_cleaner(sent)
return sent
def filter_triplet(triplet):
return [filter_sent(sent) for sent in triplet]
```
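As a quick stdlib-only illustration of the `symbol_cleaner` step (the `clean_text` call additionally requires the `neattext` package), it strips leading and trailing punctuation while leaving inner characters untouched:

```python
import re

def symbol_cleaner(s):
    # strip leading and trailing non-alphanumeric characters
    s = re.sub(r"^[^\w\d]+", "", s)
    s = re.sub(r"[^\w\d]+$", "", s)
    return s

print(symbol_cleaner('"Hei, verden!"'))  # Hei, verden
```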
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
harpreetsahota/gemma_vibecheck_rtprompts | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: DeciLM-7B-Instruct
dtype: string
- name: Gemma-7B-it
dtype: string
- name: coherence_DeciLM-7B-Instruct
struct:
- name: reasoning
dtype: string
- name: score
dtype: int64
- name: value
dtype: string
- name: coherence_Gemma-7B-it
struct:
- name: reasoning
dtype: string
- name: score
dtype: int64
- name: value
dtype: string
splits:
- name: train
num_bytes: 94014
num_examples: 30
download_size: 55270
dataset_size: 94014
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TinyPixel/s_2 | ---
dataset_info:
features:
- name: human
dtype: string
- name: bot
dtype: string
splits:
- name: train
num_bytes: 36254174
num_examples: 69374
download_size: 18670866
dataset_size: 36254174
---
# Dataset Card for "s_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/agatsuma_kaede_alicegearaegisexpansion | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Agatsuma Kaede
This is the dataset of Agatsuma Kaede, containing 36 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 36 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 83 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 96 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 36 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 36 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 36 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 83 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 83 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 67 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 96 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 96 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
TioRob/Larin | ---
license: openrail
---
|
iamnguyen/alpaca-chat | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 115530761
num_examples: 12062
- name: test
num_bytes: 5912279
num_examples: 635
download_size: 51467816
dataset_size: 121443040
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
dvrkdvys/SZA_Speaking | ---
license: openrail
---
|
jaty54/datapython | ---
license: mit
---
|
Rivoks/movingthings | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: labels
dtype: string
splits:
- name: train
num_bytes: 109524487.0
num_examples: 233
download_size: 109529708
dataset_size: 109524487.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card in progress... |
opsci/Astree | ---
license: cc0-1.0
---
Aſtrée† is a repository of 2000 early modern instructions in French, drawn from 162 French novels published between 1600 and 1700. Aſtrée can be used to fine-tune any LLM on early modern French.
All the instructions were created from one-page excerpts extracted from public domain works with historical writing and typography. They may include OCR errors that should not significantly affect the quality of text generation.
Beyond its cultural relevance, Aſtrée provides a very good sample for fine-tuning demonstrations, as even a small set will dramatically change the output of the LLM.
†*The correct name could not be used in the title due to alphanumerical restrictions on accented characters and the historical long s. It is obviously inspired by one of the most celebrated novels of 17th-century French literature, L'Aſtrée.* |
zhyzzz/autotrain-data-logic_form_generation | ---
task_categories:
- summarization
---
# AutoTrain Dataset for project: logic_form_generation
## Dataset Description
This dataset has been automatically processed by AutoTrain for project logic_form_generation.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "find \\tan S.",
"target": "[Find(TanOf(Angle(S)))]."
},
{
"text": "Find x so that the quadrilateral is a parallelogram.",
"target": "[Find(x)] so [Parallelogram($)]"
}
]
```
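A minimal sketch, assuming the structure above, of parsing such samples with Python's standard library (note the backslash in `\tan` is escaped as `\\` in the JSON source):

```python
import json

# the two samples shown above, as a raw JSON string
sample = r'''
[
  {"text": "find \\tan S.", "target": "[Find(TanOf(Angle(S)))]."},
  {"text": "Find x so that the quadrilateral is a parallelogram.",
   "target": "[Find(x)] so [Parallelogram($)]"}
]
'''

records = json.loads(sample)
for r in records:
    print(f'{r["text"]!r} -> {r["target"]!r}')
```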
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2397 |
| valid | 600 |
|
dylanebert/igf-results | ---
license: mit
---
|
fathyshalab/reklamation24_oeffentlichkeit-soziales | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 85044
num_examples: 152
- name: test
num_bytes: 21399
num_examples: 39
download_size: 0
dataset_size: 106443
---
# Dataset Card for "reklamation24_oeffentlichkeit-soziales"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/alice_margatroid_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of alice_margatroid/アリス・マーガトロイド/앨리스마가트로이드 (Touhou)
This is the dataset of alice_margatroid/アリス・マーガトロイド/앨리스마가트로이드 (Touhou), containing 500 images and their tags.
The core tags of this character are `blonde_hair, short_hair, hairband, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 648.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alice_margatroid_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 400.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alice_margatroid_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1180 | 801.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alice_margatroid_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 586.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alice_margatroid_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1180 | 1.05 GiB | [Download](https://huggingface.co/datasets/CyberHarem/alice_margatroid_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/alice_margatroid_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
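For the IMG+TXT packages, each image ships with a same-named `.txt` tag file. A minimal stdlib sketch to iterate image/tag pairs after extraction (assuming `.png` files; actual extensions may vary):

```python
from pathlib import Path

def iter_img_txt_pairs(dataset_dir):
    # yield (image_path, tag_string) for each image with a sibling .txt file
    for img in sorted(Path(dataset_dir).glob("*.png")):
        txt = img.with_suffix(".txt")
        if txt.exists():
            yield img, txt.read_text(encoding="utf-8").strip()

for img, tags in iter_img_txt_pairs("dataset_dir"):
    print(img.name, tags)
```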
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, capelet, sash, solo, black_pantyhose, blue_dress, lace-up_boots, smile, wrist_cuffs, blush, open_mouth, ribbon |
| 1 | 5 |  |  |  |  |  | 1girl, capelet, sash, solo, blue_dress, bow, looking_at_viewer, smile, blush, book |
| 2 | 7 |  |  |  |  |  | 1girl, capelet, dress, sash, smile, solo, open_mouth, book, bow |
| 3 | 7 |  |  |  |  |  | 1girl, capelet, sash, simple_background, solo, white_background, smile, blue_dress, looking_at_viewer, long_sleeves |
| 4 | 7 |  |  |  |  |  | 1girl, blue_dress, capelet, sash, solo, looking_at_viewer, puppet_strings, ribbon, lolita_hairband, short_sleeves, bow, simple_background, white_background |
| 5 | 27 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blue_dress, hair_between_eyes, red_hairband, white_capelet, bangs, frills, smile, closed_mouth, simple_background, blush, upper_body, white_background, lolita_hairband, breasts, red_necktie, puffy_short_sleeves |
| 6 | 5 |  |  |  |  |  | 1girl, capelet, dress, sash, solo, book, petals, flower, on_side |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | capelet | sash | solo | black_pantyhose | blue_dress | lace-up_boots | smile | wrist_cuffs | blush | open_mouth | ribbon | bow | looking_at_viewer | book | dress | simple_background | white_background | long_sleeves | puppet_strings | lolita_hairband | short_sleeves | hair_between_eyes | red_hairband | white_capelet | bangs | frills | closed_mouth | upper_body | breasts | red_necktie | puffy_short_sleeves | petals | flower | on_side |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------|:-------|:------------------|:-------------|:----------------|:--------|:--------------|:--------|:-------------|:---------|:------|:--------------------|:-------|:--------|:--------------------|:-------------------|:---------------|:-----------------|:------------------|:----------------|:--------------------|:---------------|:----------------|:--------|:---------|:---------------|:-------------|:----------|:--------------|:----------------------|:---------|:---------|:----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | X | | X | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | | | | X | | | X | | X | | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | | X | | X | | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | X | X | | X | | | | | | X | X | X | | | X | X | | X | X | X | | | | | | | | | | | | | |
| 5 | 27 |  |  |  |  |  | X | | | X | | X | | X | | X | | | | X | | | X | X | | | X | | X | X | X | X | X | X | X | X | X | X | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | X | X | X |
|
YosefLab-classes/lung_cell_atlas-core | ---
license:
- unknown
converted_from: zenodo
zenodo_id: '7897022'
---
# Dataset Card for Unintegrated lung cell atlas
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://zenodo.org/record/7897022
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
<p>Dataset from the Lung cell atlas study:</p>
<p>https://www.biorxiv.org/content/10.1101/2022.03.10.483747v1</p>
<p>Extracted from</p>
<p>https://cellxgene.cziscience.com/collections/6f6d381a-7701-4781-935c-db10d30de293</p>
<p> </p>
<p>The file is zstd compressed, as explained in</p>
<p>https://anndata.readthedocs.io/en/latest/generated/anndata.AnnData.write_h5ad.html</p>
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was shared by Sikkema et al.
### Licensing Information
The license for this dataset is https://creativecommons.org/licenses/by/4.0/legalcode
### Citation Information
```bibtex
@dataset{sikkema_et_al_2022_7897022,
author = {Sikkema et al},
title = {Unintegrated lung cell atlas},
month = mar,
year = 2022,
publisher = {Zenodo},
doi = {10.5281/zenodo.7897022},
url = {https://doi.org/10.5281/zenodo.7897022}
}
```
### Contributions
[More Information Needed] |
KaiLv/UDR_MNLI | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: label
dtype: int64
- name: label_text
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 77946210
num_examples: 263789
- name: validation
num_bytes: 883710
num_examples: 3000
- name: validation_mm
num_bytes: 910699
num_examples: 3000
- name: debug
num_bytes: 29518034
num_examples: 100000
download_size: 47966458
dataset_size: 109258653
---
# Dataset Card for "UDR_MNLI"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/honolulu_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of honolulu/ホノルル/火奴鲁鲁 (Azur Lane)
This is the dataset of honolulu/ホノルル/火奴鲁鲁 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `long_hair, red_hair, breasts, twintails, bangs, red_eyes, ribbon, hair_ribbon, large_breasts, black_ribbon, very_long_hair, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 860.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honolulu_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 422.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honolulu_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1372 | 992.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honolulu_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 730.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/honolulu_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1372 | 1.50 GiB | [Download](https://huggingface.co/datasets/CyberHarem/honolulu_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/honolulu_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_sailor_collar, black_skirt, blush, pleated_skirt, serafuku, short_sleeves, solo, white_shirt, beret, black_headwear, black_pantyhose, looking_at_viewer, simple_background, white_background, black_choker, official_alternate_costume, closed_mouth, sitting |
| 1 | 7 |  |  |  |  |  | 1girl, beret, black_choker, black_footwear, black_headwear, black_pantyhose, black_skirt, blush, looking_at_viewer, serafuku, short_sleeves, solo, white_shirt, black_sailor_collar, closed_mouth, full_body, holding_umbrella, loafers, official_alternate_costume, pleated_skirt, school_bag, simple_background, teddy_bear, white_background, collarbone, sitting, miniskirt |
| 2 | 11 |  |  |  |  |  | 1girl, black_thighhighs, blush, cleavage, elbow_gloves, solo, white_gloves, dress, looking_at_viewer, bare_shoulders, black_panties, chain, garter_straps, thighs, open_mouth, sitting, collarbone, peaked_cap, simple_background, white_background |
| 3 | 5 |  |  |  |  |  | 1girl, black_thighhighs, chain, cleavage, dress, elbow_gloves, garter_straps, jacket_on_shoulders, looking_at_viewer, peaked_cap, solo, white_gloves, white_headwear, blush, closed_mouth, thighs, buttons, sitting, hand_on_hip |
| 4 | 5 |  |  |  |  |  | 1girl, black_bikini, blush, cleavage, collarbone, solo, white_background, bare_shoulders, looking_at_viewer, simple_background, closed_mouth, upper_body, eyewear_on_head, huge_breasts, navel, sitting, sunglasses, wet |
| 5 | 6 |  |  |  |  |  | 1girl, black_bikini, blush, looking_at_viewer, open_mouth, solo, cleavage, collarbone, eyewear_on_head, starfish, sunglasses, water, bare_shoulders, hose, thighs, wet, ahoge, navel, star_hair_ornament, tentacles |
| 6 | 9 |  |  |  |  |  | bare_shoulders, black_bikini, blue_sky, blush, cleavage, collarbone, day, looking_at_viewer, outdoors, 1girl, cowboy_shot, eyewear_on_head, navel, ocean, solo, star_hair_ornament, stomach, sunglasses, water, cloud, holding, beach, halterneck, starfish, thighs, bare_arms, closed_mouth, hose, standing, wet |
| 7 | 7 |  |  |  |  |  | 1boy, 1girl, blush, hetero, paizuri, beach, breasts_squeezed_together, eyewear_on_head, nipples, ocean, outdoors, penis, sunglasses, day, motion_lines, open_mouth, solo_focus, collarbone, cum, huge_breasts, looking_at_viewer, mosaic_censoring, nude, pov, water, ahoge, black_bikini, blue_sky, cloud, motion_blur, star_hair_ornament |
| 8 | 28 |  |  |  |  |  | 1girl, blush, wide_sleeves, looking_at_viewer, solo, obi, blue_kimono, floral_print, yukata, hair_flower, holding, white_thighhighs, ahoge, collarbone, long_sleeves, thighs, open_mouth, closed_mouth, fireworks, night, checkered_sash |
| 9 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, nipples, collarbone, navel, simple_background, closed_mouth, completely_nude, female_pubic_hair, white_background |
| 10 | 12 |  |  |  |  |  | 1girl, blush, hetero, white_gloves, 1boy, elbow_gloves, solo_focus, thighhighs, nipples, penis, sex, nude, open_mouth, sweat, vaginal, looking_at_viewer, mosaic_censoring, pussy, spread_legs, navel, on_back |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_sailor_collar | black_skirt | blush | pleated_skirt | serafuku | short_sleeves | solo | white_shirt | beret | black_headwear | black_pantyhose | looking_at_viewer | simple_background | white_background | black_choker | official_alternate_costume | closed_mouth | sitting | black_footwear | full_body | holding_umbrella | loafers | school_bag | teddy_bear | collarbone | miniskirt | black_thighhighs | cleavage | elbow_gloves | white_gloves | dress | bare_shoulders | black_panties | chain | garter_straps | thighs | open_mouth | peaked_cap | jacket_on_shoulders | white_headwear | buttons | hand_on_hip | black_bikini | upper_body | eyewear_on_head | huge_breasts | navel | sunglasses | wet | starfish | water | hose | ahoge | star_hair_ornament | tentacles | blue_sky | day | outdoors | cowboy_shot | ocean | stomach | cloud | holding | beach | halterneck | bare_arms | standing | 1boy | hetero | paizuri | breasts_squeezed_together | nipples | penis | motion_lines | solo_focus | cum | mosaic_censoring | nude | pov | motion_blur | wide_sleeves | obi | blue_kimono | floral_print | yukata | hair_flower | white_thighhighs | long_sleeves | fireworks | night | checkered_sash | completely_nude | female_pubic_hair | thighhighs | sex | sweat | vaginal | pussy | spread_legs | on_back |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:----------------------|:--------------|:--------|:----------------|:-----------|:----------------|:-------|:--------------|:--------|:-----------------|:------------------|:--------------------|:--------------------|:-------------------|:---------------|:-----------------------------|:---------------|:----------|:-----------------|:------------|:-------------------|:----------|:-------------|:-------------|:-------------|:------------|:-------------------|:-----------|:---------------|:---------------|:--------|:-----------------|:----------------|:--------|:----------------|:---------|:-------------|:-------------|:----------------------|:-----------------|:----------|:--------------|:---------------|:-------------|:------------------|:---------------|:--------|:-------------|:------|:-----------|:--------|:-------|:--------|:---------------------|:------------|:-----------|:------|:-----------|:--------------|:--------|:----------|:--------|:----------|:--------|:-------------|:------------|:-----------|:-------|:---------|:----------|:----------------------------|:----------|:--------|:---------------|:-------------|:------|:-------------------|:-------|:------|:--------------|:---------------|:------|:--------------|:---------------|:---------|:--------------|:-------------------|:---------------|:------------|:--------|:-----------------|:------------------|:--------------------|:-------------|:------|:--------|:----------|:--------|:--------------|:----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | | | X | | | | X | | | | | X | X | X | | | | X | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | | | X | | | | | X | | | | | X | X | | | | | | | | | X | X | X | X | X | | | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | X | | | | X | | | | | X | X | X | | | X | X | | | | | | | X | | | X | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | X | | | | X | | | | | X | | | | | | | | | | | | | X | | | X | | | | X | | | | X | X | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | | X | | | | X | | | | | X | | | | | X | | | | | | | | X | | | X | | | | X | | | | X | | | | | | | X | | X | | X | X | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | | X | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | X | | X | X | | X | | | X | | X | X | | X | X | X | | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 8 | 28 |  |  |  |  |  | X | | | X | | | | X | | | | | X | | | | | X | | | | | | | | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | | | X | | | | X | | | | | X | X | X | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | |
| 10 | 12 |  |  |  |  |  | X | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | | | X | X | | X | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
declare-lab/HumanEval_CORE | ---
license: apache-2.0
---
Dataset introduced in the paper _Caught in the Quicksand of Reasoning, Far from AGI Summit: Evaluating LLMs' Mathematical and Coding Competency through Ontology-guided Interventions_.
This dataset was created by randomly sampling five questions from HumanEval (OpenAI) and perturbing them using an ontology.
<img src="https://raw.githubusercontent.com/declare-lab/llm_robustness/9a358fc0a331b63ffa3047fb3907dd92abd85b0a/assets/ontology_uni.png" alt="Image" width="800" height="800">
# Performance of LLMs on CORE
| Domain | Original | Logic Alteration | | | | Avg. | Concept Analysis | | | Avg. | Format Change | | Avg. | Form. Constraint | Weighted Avg. |
|--------|----------|------------------|---|---|---|------|------------------|---|---|------|---------------|---|------|------------------|---------------|
| Dimension | | Quest. Simpl. | Reason Adjust. | Compute. Adjust. | Symbol Manip. | Perf. | Quest. Under. | Sol. Eval. | Error Debug | Perf. | Alt. Format | Pair. Comp. | Perf. | Answer Constraint | |
| **GPT-4** | 80 | 90 | 37.5 | 46.67 | 50 | 52.29 | 65 | 80 | 44 | 61.54 | 65 | 40 | 60.00 | 55 | 56.70 |
| **GPT-3.5** | 80 | 73.68 | 35 | 40 | 29.41 | 40.74 | 60 | 75 | 40 | 56.92 | 50 | 40 | 48.00 | 45 | 47.09 |
| **Gemini** | 80 | 80 | 32.5 | 53.33 | 23.53 | 41.28 | 65 | 75 | 44 | 60.00 | 45 | 40 | 44.00 | 35 | 47.32 |
| **Llama2-Chat** | 60 | 45 | 12.5 | 33.33 | 11.76 | 21.10 | 50 | 50 | 8 | 33.85 | 25 | 40 | 28.00 | 20 | 36.61 |
| **CodeLlama** | 60 | 80 | 40 | 40 | 11.76 | 38.53 | 35 | 35 | 28 | 32.31 | 40 | 0 | 32.00 | 40 | 26.34 |
| **Average** | 72 | 73.74 | 31.5 | 42.67 | 25.29 | 38.79 | 55 | 63 | 32.8 | 48.92 | 45 | 32 | 42.40 | 39.00 | 42.81 |
|
dmcooller/schevchenko-mistral | ---
dataset_info:
features:
- name: example
dtype: string
splits:
- name: train
num_bytes: 358424
num_examples: 238
- name: test
num_bytes: 59898
num_examples: 41
download_size: 228323
dataset_size: 418322
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
male-2/training_v3-public | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 983
num_examples: 1
download_size: 3293
dataset_size: 983
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bernardocecchetto/BirdCLEF-Challenge2023-Kaggle | ---
license: apache-2.0
---
This dataset contains audio recordings of 264 bird species singing. Every recording was processed as follows:
1. Converted from stereo to mono
2. Resampled to 16 kHz
3. High-pass filtered (1500 Hz cutoff, filter order 16)
4. Normalized
The raw dataset was provided by the BirdCLEF 2023 challenge on Kaggle. You can access it at https://www.kaggle.com/competitions/birdclef-2023/data
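The preprocessing chain described above can be sketched in Python with SciPy. This is a minimal sketch under assumed tooling: the card does not say which libraries were actually used, and `preprocess` is a hypothetical helper name.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfilt, resample_poly

def preprocess(path_in, path_out, target_sr=16_000):
    sr, audio = wavfile.read(path_in)
    audio = audio.astype(np.float32)
    # 1. stereo -> mono
    if audio.ndim == 2:
        audio = audio.mean(axis=1)
    # 2. resample to 16 kHz
    if sr != target_sr:
        audio = resample_poly(audio, target_sr, sr)
    # 3. high-pass filter (1500 Hz cutoff, order 16)
    sos = butter(16, 1500, btype='highpass', fs=target_sr, output='sos')
    audio = sosfilt(sos, audio)
    # 4. peak-normalize to [-1, 1]
    peak = np.max(np.abs(audio))
    if peak > 0:
        audio = audio / peak
    wavfile.write(path_out, target_sr, audio.astype(np.float32))
```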
|
furry-br/ei-nerd | ---
license: openrail
---
|
webimmunization/COVID-19-vaccine-attitude-tweets | ---
annotations_creators:
- crowdsourced
language_creators:
- other
language:
- en
license: cc-by-4.0
multilinguality:
- monolingual
pretty_name: twitter covid19 tweets
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
- intent-classification
---
# Dataset Card for COVID-19-vaccine-attitude-tweets
## Dataset Description
- **Paper:** [Be Careful Who You Follow. The Impact of the Initial Set of Friends on COVID-19 Vaccine tweets](https://www.researchgate.net/publication/355726080_Be_Careful_Who_You_Follow_The_Impact_of_the_Initial_Set_of_Friends_on_COVID-19_Vaccine_Tweets)
- **Point of Contact:** [Izabela Krysinska](izabela.krysinska@doctorate.put.poznan.pl)
### Dataset Summary
The dataset consists of 2564 manually annotated tweets related to COVID-19 vaccines. The dataset can be used to discover the attitude expressed in the tweet towards the subject of COVID-19 vaccines. Tweets are in English. The dataset was curated in such a way as to maximize the likelihood of tweets with a strong emotional tone. We have assumed the existence of three classes:
- PRO (label 0): positive, the tweet unequivocally suggests support for getting vaccinated against COVID-19
- NEUTRAL (label 1): the tweet is mostly informative, does not show emotions vs. presented information, contains strong positive or negative emotions but concerning politics (vaccine distribution, vaccine passports, etc.)
- AGAINST (label 2): the tweet is clearly against vaccination and contains warnings, conspiracy theories, etc.
The dataset does not contain the content of Twitter statuses. Original tweets can be obtained via Twitter API.
You can use [`twitter`](https://python-twitter.readthedocs.io/en/latest/index.html) library:
```python
import twitter
from datasets import load_dataset
api = twitter.Api(consumer_key=<consumer key>,
consumer_secret=<consumer secret>,
access_token_key=<access token>,
access_token_secret=<access token secret>,
sleep_on_rate_limit=True)
tweets = load_dataset('webimmunization/COVID-19-vaccine-attitude-tweets')
def add_tweet_content(example):
    try:
        status = api.GetStatus(example['id'])
        text = status.text
    except twitter.TwitterError as err:
        print(err)
        text = None
    return {'status': text}
tweets_with_text = tweets.map(add_tweet_content)
```
### Supported Tasks and Leaderboards
- `text-classification`: The dataset can be used to discover the attitude expressed in the tweet towards the subject of COVID-19 vaccines, whether the tweet presents a positive, neutral or negative attitude. Success on this task can be measured by achieving a *high* AUROC or [F1](https://huggingface.co/metrics/f1).
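As a minimal illustration of the F1 measurement, the following sketch uses scikit-learn with hypothetical gold labels and predictions (the authors do not prescribe any particular tooling):

```python
from sklearn.metrics import f1_score

# Hypothetical gold labels and model predictions for illustration
# (0 = PRO, 1 = NEUTRAL, 2 = AGAINST).
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]

# Macro-averaged F1 treats the three attitude classes equally.
print(f1_score(y_true, y_pred, average='macro'))
```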
### Languages
[EN] English.
The text that can be accessed via the Twitter API using the identifiers in this dataset is in English.
## Dataset Structure
### Data Instances
The 1st column is Twitter Status ID and the 2nd column is the label denoting the attitude towards vaccines against COVID-19.
Example:
```
{
'id': '1387627601955545089',
'attitude': 0 # positive attitude
}
```
### Data Fields
- `attitude`: attitude towards vaccines against COVID-19. `0` denotes positive attitude, `1` denotes neutral attitude, `2` denotes negative attitude.
- `id`: Twitter status id
### Data Splits
[Needs More Information]
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
Social media posts.
#### Initial Data Collection and Normalization
We queried the Twitter search engine with manually curated hashtags such as #coronavaccine, #getvaccinated, #mRNA, #PfizerGang, #VaccineNoThankYou, #vaccinesWork, #BillGatesVaccine, #VaccinesKill, etc. to fetch tweets related to COVID-19 vaccines. We then searched for tweets with a conspicuous emotional load, both negative and positive. Once we had this set of emotionally loaded tweets, we started fetching other tweets posted by their authors. We collected tweets from mid-April for about a month, and then filtered out tweets that were not related to the vaccines. In this manner, we collected tweets that are more likely to be emotional rather than strictly informative.
#### Who are the source language producers?
The language producers are users of Twitter.
### Annotations
#### Annotation process
We have manually annotated over 2500 tweets using the following annotation protocol. We have assumed the existence of three classes:
- PRO (label 0): positive, the tweet unequivocally suggests support for getting vaccinated against COVID-19
- NEUTRAL (label 1): the tweet is mostly informative, does not show emotions vs. presented information, contains strong positive or negative emotions but concerning politics (vaccine distribution, vaccine passports, etc.)
- AGAINST (label 2): the tweet is clearly against vaccination and contains warnings, conspiracy theories, etc.
The PRO class consists of tweets which explicitly urge people to go get vaccinated. The AGAINST class contains tweets which explicitly warn people against getting the vaccine.
Tweet annotation has been conducted using [Prodigy](https://prodi.gy) tool. The annotators were provided with the following instructions:
- Do not spend too much time on a tweet and try to make a quick decision, the slight discrepancy in labeling (especially if you are deciding between *PRO* and *NEUTRAL*) will not affect the classifier significantly.
- Assign tweets that seem to originate from news sites as *NEUTRAL* and use *PRO* for tweets that express unequivocal support for getting the vaccine.
- There are many tweets on vaccination and politics. They should fall into the *NEUTRAL* class unless they contain a clear call to action: go get vaccinated!
- Use only the contents of the tweet to label it; do not open the links if the content of a tweet is not enough for labeling (e.g., “Hmm, interesting, https://t.co/ki345o2i345”), and skip such tweets instead of labeling them.
- Use the option to skip a tweet only when there is nothing in the tweet except for an URL or a few meaningless words, otherwise do not hesitate to put the tweet in the *NEUTRAL* class.
To verify the annotation protocol, we asked 8 annotators to annotate the same set of 100 tweets using the guidelines above. We measured the interrater agreement using Fleiss' kappa coefficient <cite>[Fleiss 1971][1]</cite>. The results were as follows:
- when measuring the agreement with four possible classes (*PRO*, *NEUTRAL*, *AGAINST*, *NONE*, where the last class represents tweets that were rejected from annotation), the agreement is `kappa=0.3940`
- when measuring the agreement after removing tweets that were rejected, the agreement is `kappa=0.3560`
- when measuring the agreement if rejected tweets are classified as *NEUTRAL*, the agreement is `kappa=0.3753`
- when measuring the agreement for only two classes (using *PRO*, *NEUTRAL* and *NONE* as one class, and *AGAINST* as another class), the agreement is `kappa=0.5419`
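Agreement scores of this kind can be computed with, for example, `statsmodels`. The sketch below uses a small hypothetical ratings matrix, not the project's actual annotation data:

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical ratings for illustration: rows = tweets, columns = 8 annotators,
# values = labels (0 = PRO, 1 = NEUTRAL, 2 = AGAINST, 3 = NONE/rejected).
ratings = np.array([
    [0, 0, 1, 0, 0, 1, 0, 0],
    [2, 2, 2, 2, 2, 2, 3, 2],
    [1, 1, 0, 1, 3, 1, 1, 1],
    [0, 1, 1, 1, 1, 1, 1, 0],
])

# Convert per-rater labels into an items x categories count table,
# which is the input format fleiss_kappa expects.
table, _ = aggregate_raters(ratings)
print(fleiss_kappa(table))
```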
#### Who are the annotators?
[Members of the #WebImmunization project](https://webimmunization.cm-uj.krakow.pl/en/team/)
### Personal and Sensitive Information
According to the Twitter developer policy, if displayed content ceases to be available through the Twitter API, it cannot be obtained from other sources. Thus, we provide tweet ids to maintain the integrity of all Twitter content with the Twitter service. The proper way to extract tweet content is via the Twitter API. Whenever Twitter decides to suspend the author of a tweet, or the author decides to delete their tweet, it is no longer possible to obtain the tweet's content with this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The COVID-19 is a serious global health threat that can be mitigated only by public health interventions that require massive participation. Mass vaccination against COVID-19 is one of the most effective and economically promising solutions to stop the spread of the Sars-Cov-2 virus, which is responsible for the pandemic. Understanding how misinformation about COVID-19 vaccines is spreading in one of the globally most important social networks is paramount.
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
#### Interannotator agreement
According to a popular interpretation of Fleiss' kappa <cite>[Landis 1977][2]</cite>, the annotators are in fair agreement in the first three scenarios and moderate agreement in the last scenario. These results suggest that the annotators are struggling to distinguish between *PRO* and *NEUTRAL* classes, and sometimes they have divergent opinions on whether the tweet should be rejected from training. Still, they are coherent when labeling *AGAINST* tweets.
#### Suspended accounts & deleted tweets
Some of the statuses in this dataset cannot be obtained due to account suspension or tweet deletion. The last time we checked (15th of November, 2021), about 12% of the tweets were authored by suspended accounts and about 10% had already been deleted.
### Dataset Curators
Agata Olejniuk
Poznan University of Technology, Poland
The research leading to these results has received funding from the EEA Financial Mechanism 2014-2021. Project registration number: 2019/35/J/HS6 /03498.
### Licensing Information
[Needs More Information]
### Citation Information
```
@inproceedings{krysinska2021careful,
title={Be Careful Who You Follow: The Impact of the Initial Set of Friends on COVID-19 Vaccine Tweets},
author={Krysi{\'n}ska, Izabela and W{\'o}jtowicz, Tomi and Olejniuk, Agata and Morzy, Miko{\l}aj and Piasecki, Jan},
booktitle={Proceedings of the 2021 Workshop on Open Challenges in Online Social Networks},
pages={1--8},
year={2021}
}
```
[DOI](https://doi.org/10.1145/3472720.3483619)
### Contributions
We would like to cordially thank the [members of the #WebImmunization project](https://webimmunization.cm-uj.krakow.pl/en/team/) for helping with data annotation.
## References
[1]: Joseph L. Fleiss. Measuring nominal scale agreement among many raters. Psychological Bulletin, 76(5):378, 1971.
[2]: J. Richard Landis and Gary G. Koch. The measurement of observer agreement for categorical data. Biometrics, pages 159–174, 1977. |
BangumiBase/senkizesshousymphogear | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Senki Zesshou Symphogear
This is the image base of bangumi Senki Zesshou Symphogear; we detected 71 characters and 8992 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 416 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 43 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 73 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 52 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 46 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 36 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 28 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 216 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 56 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 125 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 106 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 22 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 20 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 33 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 194 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 85 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 26 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 15 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 577 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 98 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 583 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 62 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 491 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 65 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 78 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 264 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 45 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 14 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 336 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 294 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 138 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 102 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 23 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 25 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 376 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 40 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 20 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 29 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 545 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 39 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 68 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 44 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 75 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 19 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 27 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 55 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 447 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 218 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 81 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 64 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 7 | [Download](50/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 51 | 64 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 29 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 13 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 10 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 7 | [Download](55/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 56 | 22 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 58 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 36 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 10 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 18 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 59 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 8 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 42 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 9 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 37 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 10 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 12 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 6 | [Download](68/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 69 | 56 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 1545 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
AdaptLLM/med_knowledge_prob | ---
configs:
- config_name: Anaesthesia
data_files:
- split: test
path: Anaesthesia.jsonl
- config_name: Anatomy
data_files:
- split: test
path: Anatomy.jsonl
- config_name: Biochemistry
data_files:
- split: test
path: Biochemistry.jsonl
- config_name: Dental
data_files:
- split: test
path: Dental.jsonl
- config_name: ENT
data_files:
- split: test
path: ENT.jsonl
- config_name: Forensic Medicine
data_files:
- split: test
path: Forensic Medicine.jsonl
- config_name: Gynaecology & Obstetrics
data_files:
- split: test
path: Gynaecology & Obstetrics.jsonl
- config_name: Medicine
data_files:
- split: test
path: Medicine.jsonl
- config_name: Microbiology
data_files:
- split: test
path: Microbiology.jsonl
- config_name: Ophthalmology
data_files:
- split: test
path: Ophthalmology.jsonl
- config_name: Orthopedics
data_files:
- split: test
path: Orthopedics.jsonl
- config_name: Pathology
data_files:
- split: test
path: Pathology.jsonl
- config_name: Pediatrics
data_files:
- split: test
path: Pediatrics.jsonl
- config_name: Pharmacology
data_files:
- split: test
path: Pharmacology.jsonl
- config_name: Physiology
data_files:
- split: test
path: Physiology.jsonl
- config_name: Psychiatry
data_files:
- split: test
path: Psychiatry.jsonl
- config_name: Radiology
data_files:
- split: test
path: Radiology.jsonl
- config_name: Skin
data_files:
- split: test
path: Skin.jsonl
- config_name: Social & Preventive Medicine
data_files:
- split: test
path: Social & Preventive Medicine.jsonl
- config_name: Surgery
data_files:
- split: test
path: Surgery.jsonl
- config_name: Unknown
data_files:
- split: test
path: Unknown.jsonl
task_categories:
- text-classification
- question-answering
- zero-shot-classification
language:
- en
tags:
- medical
- chemistry
- biology
---
# Domain Adaptation of Large Language Models
This repo contains the **Biomedicine Knowledge Probing dataset** used in our **ICLR 2024** paper [Adapting Large Language Models via Reading Comprehension](https://huggingface.co/papers/2309.09530).
We explore **continued pre-training on domain-specific corpora** for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to **transform large-scale pre-training corpora into reading comprehension texts**, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. **Our 7B model competes with much larger domain-specific models like BloombergGPT-50B**.
### 🤗 We are currently working hard on developing models across different domains, scales and architectures! Please stay tuned! 🤗
**************************** **Updates** ****************************
* 2024/4/14: Released the knowledge probing datasets at [med_knowledge_prob](https://huggingface.co/datasets/AdaptLLM/med_knowledge_prob) and [law_knowledge_prob](https://huggingface.co/datasets/AdaptLLM/law_knowledge_prob)
* 2024/4/2: Released the raw data splits (train and test) of all the evaluation datasets
* 2024/1/16: 🎉 Our [research paper](https://huggingface.co/papers/2309.09530) has been accepted by ICLR 2024!!!🎉
* 2023/12/19: Released our [13B base models](https://huggingface.co/AdaptLLM/law-LLM-13B) developed from LLaMA-1-13B.
* 2023/12/8: Released our [chat models](https://huggingface.co/AdaptLLM/law-chat) developed from LLaMA-2-Chat-7B.
* 2023/9/18: Released our [paper](https://huggingface.co/papers/2309.09530), [code](https://github.com/microsoft/LMOps), [data](https://huggingface.co/datasets/AdaptLLM/law-tasks), and [base models](https://huggingface.co/AdaptLLM/law-LLM) developed from LLaMA-1-7B.
## Domain-Specific LLMs
### LLaMA-1-7B
In our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available on Hugging Face: [Biomedicine-LLM](https://huggingface.co/AdaptLLM/medicine-LLM), [Finance-LLM](https://huggingface.co/AdaptLLM/finance-LLM) and [Law-LLM](https://huggingface.co/AdaptLLM/law-LLM). The performance of AdaptLLM compared to other domain-specific LLMs is shown below:
<p align='center'>
<img src="https://cdn-uploads.huggingface.co/production/uploads/650801ced5578ef7e20b33d4/6efPwitFgy-pLTzvccdcP.png" width="700">
</p>
### LLaMA-1-13B
Moreover, we scale up our base model to LLaMA-1-13B to see if **our method is similarly effective for larger-scale models**, and the results are consistently positive too: [Biomedicine-LLM-13B](https://huggingface.co/AdaptLLM/medicine-LLM-13B), [Finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B) and [Law-LLM-13B](https://huggingface.co/AdaptLLM/law-LLM-13B).
## Domain-Specific LLaMA-2-Chat
Our method is also effective for aligned models! LLaMA-2-Chat requires a [specific data format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), and our **reading comprehension texts fit this data format perfectly** when transformed into multi-turn conversations. We have also open-sourced chat models for different domains: [Biomedicine-Chat](https://huggingface.co/AdaptLLM/medicine-chat), [Finance-Chat](https://huggingface.co/AdaptLLM/finance-chat) and [Law-Chat](https://huggingface.co/AdaptLLM/law-chat).
## Domain-Specific Tasks
### Pre-templatized/Formatted Testing Splits
To easily reproduce our prompting results, we have uploaded the filled-in zero/few-shot input instructions and output completions of the test split of each domain-specific task: [biomedicine-tasks](https://huggingface.co/datasets/AdaptLLM/medicine-tasks), [finance-tasks](https://huggingface.co/datasets/AdaptLLM/finance-tasks), and [law-tasks](https://huggingface.co/datasets/AdaptLLM/law-tasks).
**Note:** those filled-in instructions are specifically tailored for models before alignment and do NOT fit the specific data format required for chat models.
### Raw Datasets
We have also uploaded the raw training and testing splits, for facilitating fine-tuning or other usages: [ChemProt](https://huggingface.co/datasets/AdaptLLM/ChemProt), [RCT](https://huggingface.co/datasets/AdaptLLM/RCT), [ConvFinQA](https://huggingface.co/datasets/AdaptLLM/ConvFinQA), [FiQA_SA](https://huggingface.co/datasets/AdaptLLM/FiQA_SA), [Headline](https://huggingface.co/datasets/AdaptLLM/Headline), [NER](https://huggingface.co/datasets/AdaptLLM/NER), [FPB](https://huggingface.co/datasets/AdaptLLM/FPB)
The other datasets used in our paper have already been available in huggingface.
### Domain Knowledge Probing
Our pre-processed knowledge probing datasets are available at: [med_knowledge_prob](https://huggingface.co/datasets/AdaptLLM/med_knowledge_prob) and [law_knowledge_prob](https://huggingface.co/datasets/AdaptLLM/law_knowledge_prob)
## Citation
If you find our work helpful, please cite us:
```bibtex
@inproceedings{
cheng2024adapting,
title={Adapting Large Language Models via Reading Comprehension},
author={Daixuan Cheng and Shaohan Huang and Furu Wei},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=y886UXPEZ0}
}
```
and the original dataset:
```bibtex
@inproceedings{MedMCQA,
author = {Ankit Pal and
Logesh Kumar Umapathi and
Malaikannan Sankarasubbu},
title = {MedMCQA: {A} Large-scale Multi-Subject Multi-Choice Dataset for Medical
domain Question Answering},
booktitle = {{CHIL}},
series = {Proceedings of Machine Learning Research},
volume = {174},
pages = {248--260},
publisher = {{PMLR}},
year = {2022}
}
```
|
anthony2261/paddy-disease-classification | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': bacterial_leaf_blight
'1': bacterial_leaf_streak
'2': bacterial_panicle_blight
'3': blast
'4': brown_spot
'5': dead_heart
'6': downy_mildew
'7': hispa
'8': normal
'9': tungro
- name: variety
dtype:
class_label:
names:
'0': ADT45
'1': IR20
'2': KarnatakaPonni
'3': Onthanel
'4': Ponni
'5': Surya
'6': Zonal
'7': AndraPonni
'8': AtchayaPonni
'9': RR
- name: age
dtype: int16
splits:
- name: train
num_bytes: 834127756.552
num_examples: 10407
download_size: 816344863
dataset_size: 834127756.552
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- image-classification
tags:
- biology
- medical
pretty_name: Paddy Disease Classification
size_categories:
- 1K<n<10K
---
# Dataset Card for "paddy-disease-classification"
Taken from the Paddy Doctor Kaggle [Competition](https://www.kaggle.com/competitions/paddy-disease-classification/)
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jxu9001/custom_ontonotes5_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: tokens
sequence: string
- name: tags
sequence: int64
splits:
- name: train
num_bytes: 5380693
num_examples: 12195
- name: validation
num_bytes: 684113
num_examples: 1553
- name: test
num_bytes: 687020
num_examples: 1573
download_size: 1374069
dataset_size: 6751826
---
# Dataset Card for "custom_ontonotes5_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ericyu/CNAMCD_Cropped | ---
dataset_info:
features:
- name: imageA
dtype: image
- name: imageB
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 144417263.7991308
num_examples: 6019
- name: test
num_bytes: 48131017.86226156
num_examples: 2006
- name: val
num_bytes: 48155065.57960765
num_examples: 2007
download_size: 239525611
dataset_size: 240703347.241
---
# Dataset Card for "CNAMCD_Cropped"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
weaviate/WithoutRetrieval-SchemaSplit-Train-40 | ---
license: apache-2.0
---
|
phyloforfun/HLT_Kew_WCVP_SLTPvA_v1-0_medium__T20-OCR-C25-L25-E50-R10 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 10461011
num_examples: 10000
download_size: 1500313
dataset_size: 10461011
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
larryvrh/ShareGPT-Zh_Only | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: src
dtype: string
splits:
- name: train
num_bytes: 69835231
num_examples: 8631
download_size: 32862465
dataset_size: 69835231
task_categories:
- text-generation
- conversational
language:
- zh
size_categories:
- 1K<n<10K
---
# Dataset Card for "sharegpt"
Combined and filtered from [shibing624/sharegpt_gpt4](https://huggingface.co/datasets/shibing624/sharegpt_gpt4) and [zetavg/ShareGPT-Processed](https://huggingface.co/datasets/zetavg/ShareGPT-Processed). |
Michaelkassouf/FERRARI_SD | ---
dataset_info:
features:
- name: image
dtype: string
- name: caption
dtype: string
splits:
- name: train
num_bytes: 3495120
num_examples: 35553
download_size: 1051219
dataset_size: 3495120
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
seansullivan/bershka | ---
license: mit
---
|
PanoEvJ/job_postings_GPT | ---
dataset_info:
features:
- name: job_postings
dtype: string
- name: cover_letters
dtype: string
splits:
- name: train
num_bytes: 1242482
num_examples: 297
download_size: 517424
dataset_size: 1242482
---
# Dataset Card for "job_postings_GPT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jfloresf/aeqd_gpkg | ---
license: mit
---
|
joey234/mmlu-human_sexuality | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 4162
num_examples: 5
- name: test
num_bytes: 399119
num_examples: 131
download_size: 77080
dataset_size: 403281
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-human_sexuality"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
josemauricinho/myjobs | ---
license: openrail
---
|
jjmachan/NSFW-questions | ---
license: apache-2.0
dataset_info:
features:
- name: title
dtype: string
- name: subreddit
dtype: string
- name: post_id
dtype: string
- name: score
dtype: int64
- name: link_flair_text
dtype: string
- name: is_self
dtype: bool
- name: over_18
dtype: bool
- name: upvote_ratio
dtype: float64
- name: is_question
dtype: bool
- name: C1
dtype: string
- name: C2
dtype: string
- name: C3
dtype: string
- name: C4
dtype: string
- name: C5
dtype: string
splits:
- name: train
num_bytes: 1541472
num_examples: 1442
download_size: 904939
dataset_size: 1541472
---
|
McSpicyWithMilo/target-elements-0.2split-new-add-validation | ---
dataset_info:
features:
- name: target_element
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 14779.2
num_examples: 144
- name: test
num_bytes: 1847.4
num_examples: 18
- name: valid
num_bytes: 1847.4
num_examples: 18
download_size: 16088
dataset_size: 18474.000000000004
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
# Dataset Card for "target-elements-0.2split-new-add-validation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yjok0220/security_guide | ---
license: apache-2.0
---
|
liuyanchen1015/VALUE_mrpc_null_relcl | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 56055
num_examples: 196
- name: train
num_bytes: 128874
num_examples: 447
- name: validation
num_bytes: 12223
num_examples: 44
download_size: 140882
dataset_size: 197152
---
# Dataset Card for "VALUE_mrpc_null_relcl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuan-sf63/word_mask_D_72 | ---
dataset_info:
features:
- name: feature
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 32554740.882045604
num_examples: 201540
- name: validation
num_bytes: 3617301.117954397
num_examples: 22394
download_size: 26263345
dataset_size: 36172042.0
---
# Dataset Card for "word_mask_D_72"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nataliaElv/similarity-qa-no-vectors | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for similarity-qa-no-vectors
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, install Argilla with `pip install argilla --upgrade`, then run the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("nataliaElv/similarity-qa-no-vectors")
```
### Load with `datasets`
To load this dataset with `datasets`, install the library with `pip install datasets --upgrade`, then run the following code:
```python
from datasets import load_dataset
ds = load_dataset("nataliaElv/similarity-qa-no-vectors")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves; at the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| instruction | Instruction | text | True | False |
| input | Input | text | False | False |
| output | Output | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| quality | Rate the quality of the record: | rating | True | N/A | [1, 2, 3, 4, 5] |
| explanation | Explain your rating: | text | True | N/A | N/A |
The **suggestions** are human- or machine-generated recommendations for each question, intended to assist the annotator during the annotation process. They are always linked to existing questions and named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the value(s) of the suggestion and its metadata, respectively. The possible values are therefore the same as in the table above, with the column names carrying the "-suggestion" and "-suggestion-metadata" suffixes.
The **metadata** is a dictionary that can be used to provide additional information about a dataset record. This can be useful for giving annotators extra context, or for recording details about the record itself, such as the author, the date, the source, or a link to the original source. The metadata is always optional and can be linked to the `metadata_properties` defined in the dataset configuration file `argilla.yaml`.
**✨ NEW** The **vectors** are additional columns containing floating-point vectors, constrained to the dimensions pre-defined in the **vectors_settings** when configuring the vectors within the dataset; the vectors are always 1-dimensional. The **vectors** are optional and identified by the vector name pre-defined in the dataset configuration file `argilla.yaml`.
| Vector Name | Title | Dimensions |
|-------------|-------|------------|
| input | Input | [1, 384] |
| instruction | Instruction | [1, 384] |
| output | Output | [1, 384] |
| testing | EMPTY! | [1, 1] |
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
| text_length | text_length | integer | None - None | True |
The **guidelines** are optional as well; they are a plain string used to provide instructions to the annotators. Find them in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"input": "",
"instruction": "Give three tips for staying healthy.",
"output": "1. Eat a balanced diet and make sure to include plenty of fruits and vegetables. \n2. Exercise regularly to keep your body active and strong. \n3. Get enough sleep and maintain a consistent sleep schedule."
},
"metadata": {
"text_length": 241
},
"responses": [],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"explanation": [],
"explanation-suggestion": null,
"explanation-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"external_id": null,
"input": "",
"instruction": "Give three tips for staying healthy.",
"metadata": "{\"text_length\": 241}",
"output": "1. Eat a balanced diet and make sure to include plenty of fruits and vegetables. \n2. Exercise regularly to keep your body active and strong. \n3. Get enough sleep and maintain a consistent sleep schedule.",
"quality": [],
"quality-suggestion": null,
"quality-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"vectors": {
"input": null,
"instruction": null,
"output": null,
"testing": null
}
}
```
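Note that in the `datasets` representation above, the `metadata` field is a JSON-encoded string rather than a dictionary, so it needs to be decoded before use. A minimal sketch, using the value from the record above:

```python
import json

# `metadata` arrives as a JSON-encoded string in the `datasets` export,
# so decode it to get a regular dictionary.
raw_metadata = '{"text_length": 241}'  # value taken from the record above
metadata = json.loads(raw_metadata)
print(metadata["text_length"])  # → 241
```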
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; at the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
* **instruction** is of type `text`.
* (optional) **input** is of type `text`.
* **output** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **quality** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* **explanation** is of type `text`.
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **quality-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* (optional) **explanation-suggestion** is of type `text`.
* **✨ NEW** **Vectors**: As of Argilla 1.19.0, vectors have been included to add support for similarity search, i.e. exploring similar records through vector search powered by the configured search engine. The vectors are optional and are not shown in the UI; they are uploaded and used internally. Each vector must match the dimensions previously defined in its settings.
* (optional) **input** is of type `float32` and has a dimension of (1, `384`).
* (optional) **instruction** is of type `float32` and has a dimension of (1, `384`).
* (optional) **output** is of type `float32` and has a dimension of (1, `384`).
* (optional) **testing** is of type `float32` and has a dimension of (1, `1`).
Additionally, there are two more optional fields:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record, such as extra context for the annotators or details about the record itself (for example the author, the date, the source, or a link to the original source). The metadata can be linked to the `metadata_properties` defined in the dataset configuration file `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-source-metrics/pip-external | ---
dataset_info:
features:
- name: day
dtype: string
- name: num_downloads
dtype: int64
splits:
- name: langchain
num_bytes: 11770
num_examples: 535
- name: pytorch
num_bytes: 36850
num_examples: 1675
- name: tensorflow
num_bytes: 36850
num_examples: 1675
- name: openai
num_bytes: 4840
num_examples: 220
download_size: 53996
dataset_size: 90310
configs:
- config_name: default
data_files:
- split: langchain
path: data/langchain-*
- split: pytorch
path: data/pytorch-*
- split: tensorflow
path: data/tensorflow-*
- split: openai
path: data/openai-*
---
# Dataset Card for "pip-external"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python3-standardized_cluster_6_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8645613
num_examples: 5678
download_size: 0
dataset_size: 8645613
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_6_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/klin_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of klin/KLIN/KLIN (Girls' Frontline)
This is the dataset of klin/KLIN/KLIN (Girls' Frontline), containing 65 images and their tags.
The core tags of this character are `dark_skin, dark-skinned_female, grey_hair, hair_ornament, hair_ribbon, ribbon, hair_between_eyes, bangs, green_eyes, ponytail, breasts, yellow_ribbon, small_breasts, v-shaped_eyebrows`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 65 | 74.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/klin_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 65 | 44.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/klin_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 168 | 99.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/klin_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 65 | 66.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/klin_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 168 | 134.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/klin_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/klin_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
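The `IMG+TXT` packages listed above pair each image with a same-named `.txt` file holding its tags. A minimal, hedged sketch for walking such a directory after extraction (`load_img_txt_pairs` is our own helper name, and the actual archive layout may differ, e.g. contain subdirectories):

```python
import os

def load_img_txt_pairs(dataset_dir):
    """Pair each image with its same-named .txt tag file.

    Assumes the flat IMG+TXT layout described above: for every image
    file there is a sibling text file carrying the comma-separated tags.
    """
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, 'r', encoding='utf-8') as f:
                tags = f.read().strip()
            pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```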
## List of Clusters
This is the list of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 47 |  |  |  |  |  | 1girl, solo, detached_sleeves, navel, bandana, looking_at_viewer, black_thighhighs, hood, simple_background, open_mouth, sleeveless_jacket, short_shorts, white_background, black_sleeves, single_thighhigh, thigh_strap, crop_top, holding, open_shorts, bare_shoulders, bright_pupils, ahoge, brown_shorts, submachine_gun |
| 1 | 9 |  |  |  |  |  | 1girl, solo, long_sleeves, looking_at_viewer, obi, open_mouth, wide_sleeves, yellow_kimono, official_alternate_costume, asymmetrical_bangs, standing, yellow_eyes, black_kimono, fang, full_body, holding_fan, paper_fan, pouch, short_twintails, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | detached_sleeves | navel | bandana | looking_at_viewer | black_thighhighs | hood | simple_background | open_mouth | sleeveless_jacket | short_shorts | white_background | black_sleeves | single_thighhigh | thigh_strap | crop_top | holding | open_shorts | bare_shoulders | bright_pupils | ahoge | brown_shorts | submachine_gun | long_sleeves | obi | wide_sleeves | yellow_kimono | official_alternate_costume | asymmetrical_bangs | standing | yellow_eyes | black_kimono | fang | full_body | holding_fan | paper_fan | pouch | short_twintails | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------------|:--------|:----------|:--------------------|:-------------------|:-------|:--------------------|:-------------|:--------------------|:---------------|:-------------------|:----------------|:-------------------|:--------------|:-----------|:----------|:--------------|:-----------------|:----------------|:--------|:---------------|:-----------------|:---------------|:------|:---------------|:----------------|:-----------------------------|:---------------------|:-----------|:--------------|:---------------|:-------|:------------|:--------------|:------------|:--------|:------------------|:--------|
| 0 | 47 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | | | | X | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
cakiki/css_paths | ---
dataset_info:
features:
- name: repository_name
dtype: string
splits:
- name: train
num_bytes: 158651499
num_examples: 5726933
download_size: 138140586
dataset_size: 158651499
---
# Dataset Card for "css_paths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WebraftAI/unilm-v0.1-16k | ---
license: cc-by-nc-sa-4.0
---
|
itamarcard/cristiane | ---
license: openrail
---
|
enoahjr/twitter_dataset_1713141903 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 88944
num_examples: 208
download_size: 30058
dataset_size: 88944
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
amuvarma/crema-unclean | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype:
class_label:
names:
'0': anger
'1': disgust
'2': fear
'3': happy
'4': neutral
'5': sad
splits:
- name: train
num_bytes: 424198531.0
num_examples: 5209
- name: validation
num_bytes: 90646354.0
num_examples: 1116
- name: test
num_bytes: 91144354.0
num_examples: 1117
download_size: 605899055
dataset_size: 605989239.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
legoguy2424/MrBeast | ---
license: unknown
---
|
jianguo/jianguo-1234 | ---
license: openrail
---
|
lmqg/qg_itquad | ---
license: cc-by-4.0
pretty_name: SQuAD-it for question generation
language: it
multilinguality: monolingual
size_categories: 10K<n<100K
source_datasets: squad_it
task_categories:
- text-generation
task_ids:
- language-modeling
tags:
- question-generation
---
# Dataset Card for "lmqg/qg_itquad"
## Dataset Description
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
- **Point of Contact:** [Asahi Ushio](http://asahiushio.com/)
### Dataset Summary
This is a subset of [QG-Bench](https://github.com/asahi417/lm-question-generation/blob/master/QG_BENCH.md#datasets), a unified question generation benchmark proposed in
["Generative Language Models for Paragraph-Level Question Generation: A Unified Benchmark and Evaluation, EMNLP 2022 main conference"](https://arxiv.org/abs/2210.03992).
This is a modified version of [SQuAD-it](https://huggingface.co/datasets/squad_it) for question generation (QG) task.
Since the original dataset contains only training and validation sets, we manually sampled a test set from the training set;
the sampled test set shares no paragraphs with the remaining training set.
### Supported Tasks and Leaderboards
* `question-generation`: The dataset is assumed to be used to train a model for question generation.
Success on this task is typically measured by achieving a high BLEU4/METEOR/ROUGE-L/BERTScore/MoverScore (see our paper for more detail).
### Languages
Italian (it)
## Dataset Structure
An example of 'train' looks as follows.
```
{
'answer': 'Carlo III',
'question': "Il figlio di chi è morto sulla strada per Palermo e vi è sepolto?",
'sentence': 'Carlo III scelse Palermo per la sua incoronazione come Re di Sicilia.',
'paragraph': 'Dopo il trattato di Utrecht (1713), la Sicilia fu consegnata ai Savoia, ma nel 1734 fu nuovamente posseduta dai...',
'sentence_answer': '<hl> Carlo III <hl> scelse Palermo per la sua incoronazione come Re di Sicilia.',
'paragraph_answer': "Dopo il trattato di Utrecht (1713), la Sicilia fu consegnata ai Savoia, ma nel 1734 fu nuovamente posseduta dai borbonici. <hl> Carlo III <hl> scelse Palermo per la sua incoronazione come Re di Sicilia. Charles fece costruire nuove case per la popolazione in crescita, mentre il commercio e l' industria crebbero. Tuttavia, ormai Palermo era ora solo un' altra città provinciale, dato che la Corte Reale risiedeva a Napoli. Il figlio di Carlo Ferdinando, anche se non gradito dalla popolazione, si rifugiò a Palermo dopo la Rivoluzione francese del 1798. Suo figlio Alberto è morto sulla strada per Palermo ed è sepolto in città. Quando fu fondato il Regno delle Due Sicilie, la capitale originaria era Palermo (1816) ma un anno dopo si trasferì a Napoli.",
'paragraph_sentence': "Dopo il trattato di Utrecht (1713), la Sicilia fu consegnata ai Savoia, ma nel 1734 fu nuovamente posseduta dai borbonici. <hl> Carlo III scelse Palermo per la sua incoronazione come Re di Sicilia. <hl> Charles fece costruire nuove case per la popolazione in crescita, mentre il commercio e l' industria crebbero. Tuttavia, ormai Palermo era ora solo un' altra città provinciale, dato che la Corte Reale risiedeva a Napoli. Il figlio di Carlo Ferdinando, anche se non gradito dalla popolazione, si rifugiò a Palermo dopo la Rivoluzione francese del 1798. Suo figlio Alberto è morto sulla strada per Palermo ed è sepolto in città. Quando fu fondato il Regno delle Due Sicilie, la capitale originaria era Palermo (1816) ma un anno dopo si trasferì a Napoli."
}
```
The data fields are the same among all splits.
- `question`: a `string` feature.
- `paragraph`: a `string` feature.
- `answer`: a `string` feature.
- `sentence`: a `string` feature.
- `paragraph_answer`: a `string` feature, which is the same as the paragraph but with the answer highlighted by the special token `<hl>`.
- `paragraph_sentence`: a `string` feature, which is the same as the paragraph but with the sentence containing the answer highlighted by the special token `<hl>`.
- `sentence_answer`: a `string` feature, which is the same as the sentence but with the answer highlighted by the special token `<hl>`.
Each of the `paragraph_answer`, `paragraph_sentence`, and `sentence_answer` features is intended to train a question generation model, but with different information. The `paragraph_answer` and `sentence_answer` features are for answer-aware question generation, while the
`paragraph_sentence` feature is for sentence-aware question generation.
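The highlighted features can be derived from the raw fields by wrapping the answer (or the sentence containing it) in `<hl>` tokens. A minimal sketch (the `highlight_answer` helper name is ours, not part of the dataset):

```python
def highlight_answer(paragraph: str, answer: str, hl: str = "<hl>") -> str:
    """Wrap the first occurrence of `answer` in `paragraph` with
    highlight tokens, mirroring the `sentence_answer` feature above."""
    idx = paragraph.find(answer)
    if idx == -1:
        raise ValueError("answer not found in paragraph")
    return paragraph[:idx] + f"{hl} {answer} {hl}" + paragraph[idx + len(answer):]

sentence = "Carlo III scelse Palermo per la sua incoronazione come Re di Sicilia."
print(highlight_answer(sentence, "Carlo III"))
# → <hl> Carlo III <hl> scelse Palermo per la sua incoronazione come Re di Sicilia.
```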
## Data Splits
|train|validation|test |
|----:|---------:|----:|
|46550| 7609 |7609|
## Citation Information
```
@inproceedings{ushio-etal-2022-generative,
title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
author = "Ushio, Asahi and
Alva-Manchego, Fernando and
Camacho-Collados, Jose",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, U.A.E.",
publisher = "Association for Computational Linguistics",
}
``` |
Mitsuki-Sakamoto/alpaca_farm-reward-model-deberta-v3-large-v2-re-preference-64-nsample-8_random | ---
dataset_info:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
splits:
- name: preference
num_bytes: 25769164
num_examples: 20001
download_size: 12288408
dataset_size: 25769164
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
splits:
- name: preference
num_bytes: 25841710
num_examples: 20001
download_size: 12249297
dataset_size: 25841710
- config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
splits:
- name: preference
num_bytes: 25739637
num_examples: 20001
download_size: 11961077
dataset_size: 25739637
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: preference
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/preference-*
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: preference
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/preference-*
- config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: preference
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/preference-*
---
|
open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B | ---
pretty_name: Evaluation run of 42dot/42dot_LLM-PLM-1.3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [42dot/42dot_LLM-PLM-1.3B](https://huggingface.co/42dot/42dot_LLM-PLM-1.3B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-15T08:12:34.029868](https://huggingface.co/datasets/open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B_public/blob/main/results_2023-11-15T08-12-34.029868.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2748396034833794,\n\
\ \"acc_stderr\": 0.03133274597965432,\n \"acc_norm\": 0.2767290148254369,\n\
\ \"acc_norm_stderr\": 0.032124763692846635,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931578,\n \"mc2\": 0.38680931810418795,\n\
\ \"mc2_stderr\": 0.013939564847231014,\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.0003476179896857114,\n \"f1\": 0.04587562919463095,\n\
\ \"f1_stderr\": 0.0011468980714363175\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.30119453924914674,\n \"acc_stderr\": 0.013406741767847627,\n\
\ \"acc_norm\": 0.3242320819112628,\n \"acc_norm_stderr\": 0.01367881039951882\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4287990440151364,\n\
\ \"acc_stderr\": 0.0049389301432344514,\n \"acc_norm\": 0.563931487751444,\n\
\ \"acc_norm_stderr\": 0.004948824501355477\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322716,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322716\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566016,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n\
\ \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.27167630057803466,\n\
\ \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.18620689655172415,\n \"acc_stderr\": 0.03243946159004616,\n\
\ \"acc_norm\": 0.18620689655172415,\n \"acc_norm_stderr\": 0.03243946159004616\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.0220190800122179,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.0220190800122179\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22580645161290322,\n\
\ \"acc_stderr\": 0.02378557788418101,\n \"acc_norm\": 0.22580645161290322,\n\
\ \"acc_norm_stderr\": 0.02378557788418101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479047,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479047\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.034356961683613546,\n\
\ \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.034356961683613546\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.024121125416941183,\n\
\ \"acc_norm\": 0.34615384615384615,\n \"acc_norm_stderr\": 0.024121125416941183\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230175,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230175\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29831932773109243,\n \"acc_stderr\": 0.02971914287634286,\n\
\ \"acc_norm\": 0.29831932773109243,\n \"acc_norm_stderr\": 0.02971914287634286\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26055045871559634,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.26055045871559634,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.0309645179269234,\n \"acc_norm\"\
: 0.2647058823529412,\n \"acc_norm_stderr\": 0.0309645179269234\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.23628691983122363,\n \"acc_stderr\": 0.027652153144159263,\n \"\
acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n\
\ \"acc_stderr\": 0.029105220833224622,\n \"acc_norm\": 0.25112107623318386,\n\
\ \"acc_norm_stderr\": 0.029105220833224622\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3884297520661157,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.3884297520661157,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n\
\ \"acc_stderr\": 0.02948036054954119,\n \"acc_norm\": 0.28205128205128205,\n\
\ \"acc_norm_stderr\": 0.02948036054954119\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23243933588761176,\n\
\ \"acc_stderr\": 0.015104550008905706,\n \"acc_norm\": 0.23243933588761176,\n\
\ \"acc_norm_stderr\": 0.015104550008905706\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757177,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\
\ \"acc_stderr\": 0.014816119635317003,\n \"acc_norm\": 0.2681564245810056,\n\
\ \"acc_norm_stderr\": 0.014816119635317003\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.025829163272757485,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.025829163272757485\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3183279742765273,\n\
\ \"acc_stderr\": 0.02645722506781102,\n \"acc_norm\": 0.3183279742765273,\n\
\ \"acc_norm_stderr\": 0.02645722506781102\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064356,\n \
\ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676646,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676646\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378984,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378984\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2693877551020408,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.2693877551020408,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.034605799075530255,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.034605799075530255\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03377310252209194,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03377310252209194\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931578,\n \"mc2\": 0.38680931810418795,\n\
\ \"mc2_stderr\": 0.013939564847231014\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5887924230465666,\n \"acc_stderr\": 0.013829128358676872\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \
\ \"em_stderr\": 0.0003476179896857114,\n \"f1\": 0.04587562919463095,\n\
\ \"f1_stderr\": 0.0011468980714363175\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0075815011372251705,\n \"acc_stderr\": 0.002389281512077207\n\
\ }\n}\n```"
repo_url: https://huggingface.co/42dot/42dot_LLM-PLM-1.3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|arc:challenge|25_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|drop|3_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|drop|3_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|gsm8k|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hellaswag|10_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-43-12.146243.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T08-12-34.029868.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T08-12-34.029868.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- '**/details_harness|winogrande|5_2023-11-13T15-43-12.146243.parquet'
- split: 2023_11_15T08_12_34.029868
path:
- '**/details_harness|winogrande|5_2023-11-15T08-12-34.029868.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-15T08-12-34.029868.parquet'
- config_name: results
data_files:
- split: 2023_11_13T15_43_12.146243
path:
- results_2023-11-13T15-43-12.146243.parquet
- split: 2023_11_15T08_12_34.029868
path:
- results_2023-11-15T08-12-34.029868.parquet
- split: latest
path:
- results_2023-11-15T08-12-34.029868.parquet
---
# Dataset Card for Evaluation run of 42dot/42dot_LLM-PLM-1.3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/42dot/42dot_LLM-PLM-1.3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [42dot/42dot_LLM-PLM-1.3B](https://huggingface.co/42dot/42dot_LLM-PLM-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B_public",
"harness_winogrande_5",
split="train")
```
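The configuration names above follow a mechanical pattern derived from the harness task ids: a `harness_` prefix is added, and the `|`, `-`, and `:` characters are replaced with underscores. A small helper (hypothetical, not part of any library; the pattern is inferred from the config list in this card) can build the config name for any task:

```python
def task_to_config(task: str) -> str:
    """Map a harness task id like 'hendrycksTest-abstract_algebra|5' to its
    dataset config name, e.g. 'harness_hendrycksTest_abstract_algebra_5'.

    Hypothetical helper; the naming pattern is inferred from the
    config list in this dataset card.
    """
    # Normalize the separators used in harness task ids to underscores.
    return "harness_" + task.replace("|", "_").replace("-", "_").replace(":", "_")


# Example: load the latest details for a specific MMLU subtask.
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B_public",
#     task_to_config("hendrycksTest-anatomy|5"),
#     split="latest",
# )
```

This avoids hand-writing the long config names when iterating over many tasks.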
## Latest results
These are the [latest results from run 2023-11-15T08:12:34.029868](https://huggingface.co/datasets/open-llm-leaderboard/details_42dot__42dot_LLM-PLM-1.3B_public/blob/main/results_2023-11-15T08-12-34.029868.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2748396034833794,
"acc_stderr": 0.03133274597965432,
"acc_norm": 0.2767290148254369,
"acc_norm_stderr": 0.032124763692846635,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931578,
"mc2": 0.38680931810418795,
"mc2_stderr": 0.013939564847231014,
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857114,
"f1": 0.04587562919463095,
"f1_stderr": 0.0011468980714363175
},
"harness|arc:challenge|25": {
"acc": 0.30119453924914674,
"acc_stderr": 0.013406741767847627,
"acc_norm": 0.3242320819112628,
"acc_norm_stderr": 0.01367881039951882
},
"harness|hellaswag|10": {
"acc": 0.4287990440151364,
"acc_stderr": 0.0049389301432344514,
"acc_norm": 0.563931487751444,
"acc_norm_stderr": 0.004948824501355477
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322716,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322716
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566016,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.18620689655172415,
"acc_stderr": 0.03243946159004616,
"acc_norm": 0.18620689655172415,
"acc_norm_stderr": 0.03243946159004616
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0220190800122179,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0220190800122179
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22580645161290322,
"acc_stderr": 0.02378557788418101,
"acc_norm": 0.22580645161290322,
"acc_norm_stderr": 0.02378557788418101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.034356961683613546,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.034356961683613546
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.024121125416941183,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.024121125416941183
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230175,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230175
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29831932773109243,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.29831932773109243,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26055045871559634,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.26055045871559634,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.0309645179269234,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.0309645179269234
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.029105220833224622,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.029105220833224622
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3884297520661157,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.3884297520661157,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.02948036054954119,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.02948036054954119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23243933588761176,
"acc_stderr": 0.015104550008905706,
"acc_norm": 0.23243933588761176,
"acc_norm_stderr": 0.015104550008905706
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757177,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2681564245810056,
"acc_stderr": 0.014816119635317003,
"acc_norm": 0.2681564245810056,
"acc_norm_stderr": 0.014816119635317003
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.025829163272757485,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.025829163272757485
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3183279742765273,
"acc_stderr": 0.02645722506781102,
"acc_norm": 0.3183279742765273,
"acc_norm_stderr": 0.02645722506781102
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.026992199173064356,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.026992199173064356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676646,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676646
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612378984,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612378984
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2693877551020408,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.2693877551020408,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530255,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530255
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03377310252209194,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03377310252209194
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931578,
"mc2": 0.38680931810418795,
"mc2_stderr": 0.013939564847231014
},
"harness|winogrande|5": {
"acc": 0.5887924230465666,
"acc_stderr": 0.013829128358676872
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857114,
"f1": 0.04587562919463095,
"f1_stderr": 0.0011468980714363175
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.002389281512077207
}
}
```
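Each per-task entry above shares a uniform shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), and the `"all"` entry aggregates the per-task metrics. As a minimal, self-contained sketch of that aggregation (the excerpt values are copied from this card; the `mean_metric` helper is illustrative, not part of the evaluation harness):

```python
import json

# A small excerpt in the same shape as the results above
# (values copied from the two virology/world_religions entries on this card).
results_json = """
{
  "harness|hendrycksTest-virology|5": {"acc": 0.2710843373493976, "acc_norm": 0.2710843373493976},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.2631578947368421, "acc_norm": 0.2631578947368421}
}
"""

def mean_metric(results: dict, metric: str = "acc") -> float:
    """Average one metric across every task that reports it."""
    values = [task[metric] for task in results.values() if metric in task]
    return sum(values) / len(values)

results = json.loads(results_json)
print(round(mean_metric(results, "acc"), 4))  # → 0.2671
```

The same pattern applies to `acc_norm` or any other metric key present in the per-task dictionaries.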
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of perlthoughts/Starling-LM-alpha-8x7B-MoE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [perlthoughts/Starling-LM-alpha-8x7B-MoE](https://huggingface.co/perlthoughts/Starling-LM-alpha-8x7B-MoE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Starling-LM-alpha-8x7B-MoE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T21:04:03.066898](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Starling-LM-alpha-8x7B-MoE/blob/main/results_2023-12-16T21-04-03.066898.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6485827629012115,\n\
\ \"acc_stderr\": 0.03189932261733175,\n \"acc_norm\": 0.6500544215634738,\n\
\ \"acc_norm_stderr\": 0.032542506459412,\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.01611412415688245,\n \"mc2\": 0.4639249177352108,\n\
\ \"mc2_stderr\": 0.015154559507326514\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5998293515358362,\n \"acc_stderr\": 0.014317197787809172,\n\
\ \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068283\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6655048795060745,\n\
\ \"acc_stderr\": 0.004708494114574018,\n \"acc_norm\": 0.8490340569607648,\n\
\ \"acc_norm_stderr\": 0.0035728399695219874\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.032500536843658404,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.032500536843658404\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8270042194092827,\n \"acc_stderr\": 0.024621562866768434,\n \
\ \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.024621562866768434\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573973,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47039106145251397,\n\
\ \"acc_stderr\": 0.016693154927383557,\n \"acc_norm\": 0.47039106145251397,\n\
\ \"acc_norm_stderr\": 0.016693154927383557\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303957,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303957\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495144,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495144\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.01611412415688245,\n \"mc2\": 0.4639249177352108,\n\
\ \"mc2_stderr\": 0.015154559507326514\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8058405682715075,\n \"acc_stderr\": 0.01111698339239267\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.624715693707354,\n \
\ \"acc_stderr\": 0.013337170545742927\n }\n}\n```"
repo_url: https://huggingface.co/perlthoughts/Starling-LM-alpha-8x7B-MoE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|arc:challenge|25_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|gsm8k|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hellaswag|10_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T21-04-03.066898.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T21-04-03.066898.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- '**/details_harness|winogrande|5_2023-12-16T21-04-03.066898.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T21-04-03.066898.parquet'
- config_name: results
data_files:
- split: 2023_12_16T21_04_03.066898
path:
- results_2023-12-16T21-04-03.066898.parquet
- split: latest
path:
- results_2023-12-16T21-04-03.066898.parquet
---
# Dataset Card for Evaluation run of perlthoughts/Starling-LM-alpha-8x7B-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [perlthoughts/Starling-LM-alpha-8x7B-MoE](https://huggingface.co/perlthoughts/Starling-LM-alpha-8x7B-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Starling-LM-alpha-8x7B-MoE",
"harness_winogrande_5",
split="train")
```
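As the config listing above shows, each per-run split name is derived from the run timestamp by replacing `-` and `:` with `_` (e.g. `2023-12-16T21:04:03.066898` becomes `2023_12_16T21_04_03.066898`). A small helper for building the split name from a timestamp, inferred from the names visible in this card rather than from any official API:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp into the corresponding split name.

    Split names may not contain "-" or ":", so both are replaced with "_".
    """
    return ts.replace("-", "_").replace(":", "_")


# Example: pick the split for this card's run instead of "latest".
split_name = run_timestamp_to_split("2023-12-16T21:04:03.066898")
```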
## Latest results
These are the [latest results from run 2023-12-16T21:04:03.066898](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Starling-LM-alpha-8x7B-MoE/blob/main/results_2023-12-16T21-04-03.066898.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6485827629012115,
"acc_stderr": 0.03189932261733175,
"acc_norm": 0.6500544215634738,
"acc_norm_stderr": 0.032542506459412,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.01611412415688245,
"mc2": 0.4639249177352108,
"mc2_stderr": 0.015154559507326514
},
"harness|arc:challenge|25": {
"acc": 0.5998293515358362,
"acc_stderr": 0.014317197787809172,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068283
},
"harness|hellaswag|10": {
"acc": 0.6655048795060745,
"acc_stderr": 0.004708494114574018,
"acc_norm": 0.8490340569607648,
"acc_norm_stderr": 0.0035728399695219874
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328974,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328974
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.024621562866768434,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.024621562866768434
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291947,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291947
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573973,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47039106145251397,
"acc_stderr": 0.016693154927383557,
"acc_norm": 0.47039106145251397,
"acc_norm_stderr": 0.016693154927383557
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303957,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303957
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495144,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495144
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.01611412415688245,
"mc2": 0.4639249177352108,
"mc2_stderr": 0.015154559507326514
},
"harness|winogrande|5": {
"acc": 0.8058405682715075,
"acc_stderr": 0.01111698339239267
},
"harness|gsm8k|5": {
"acc": 0.624715693707354,
"acc_stderr": 0.013337170545742927
}
}
```
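Results in this format can be aggregated with a few lines of Python. The sketch below is purely illustrative (it is not part of the evaluation harness): it parses two task entries copied from the JSON above and averages their `acc` fields; a real aggregation would load the complete results file instead.

```python
import json

# Two entries copied verbatim from the results JSON above; a full
# aggregation would read the whole file rather than this snippet.
results = json.loads("""
{
  "harness|hendrycksTest-virology|5": {"acc": 0.5301204819277109},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.847953216374269}
}
""")

# Collect per-task accuracies for the MMLU (hendrycksTest) subtasks
# and take their unweighted mean.
accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mean_acc = sum(accs) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.4f}")
```

The unweighted mean over subtasks is the convention used by the Open LLM Leaderboard for its MMLU column.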
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
gsstein/100-percent-human-dataset-llama-og | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 87588201
num_examples: 15326
- name: test
num_bytes: 3113269
num_examples: 576
- name: validation
num_bytes: 3311381
num_examples: 576
download_size: 57544687
dataset_size: 94012851
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
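As a quick sanity check on the split metadata above, the byte and example counts can be cross-totaled against the declared `dataset_size`. The numbers below are copied directly from the YAML; the helper itself is illustrative:

```python
# Split metadata copied from the YAML front matter above.
splits = {
    "train": {"num_bytes": 87588201, "num_examples": 15326},
    "test": {"num_bytes": 3113269, "num_examples": 576},
    "validation": {"num_bytes": 3311381, "num_examples": 576},
}

total_examples = sum(s["num_examples"] for s in splits.values())
total_bytes = sum(s["num_bytes"] for s in splits.values())

# Per-split average record size, in bytes.
for name, s in splits.items():
    avg = s["num_bytes"] / s["num_examples"]
    print(f"{name}: {s['num_examples']} examples, ~{avg:.0f} bytes/example")

# The byte total should equal the declared dataset_size (94012851).
print(f"total: {total_examples} examples, {total_bytes} bytes")
```

The split byte counts do sum to the declared `dataset_size` of 94,012,851 bytes; the smaller `download_size` reflects Parquet compression.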
|