datasetId | card |
|---|---|
CyberHarem/kokona_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kokona/春原ココナ/心奈 (Blue Archive)
This is the dataset of kokona/春原ココナ/心奈 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `long_hair, grey_hair, multicolored_hair, animal_ears, streaked_hair, black_hair, halo, very_long_hair, tiger_ears, breasts, small_breasts, brown_eyes, orange_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 753.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kokona_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200             | 500      | 633.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kokona_bluearchive/resolve/main/dataset-1200.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1318 | 1.33 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kokona_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
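For the `IMG+TXT` packages, each image is expected to come with a same-named `.txt` file holding its comma-separated tags. Below is a minimal sketch of downloading and iterating the `1200` package; the same-stem pairing convention and the image extensions are assumptions, not verified against the archive.
```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download and extract the 1200px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/kokona_bluearchive',
    repo_type='dataset',
    filename='dataset-1200.zip',
)
dataset_dir = 'dataset_1200'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its caption file (same-stem .txt convention assumed)
image_paths = []
for ext in ('*.png', '*.jpg', '*.webp'):
    image_paths += glob(os.path.join(dataset_dir, '**', ext), recursive=True)
for image_path in sorted(image_paths):
    txt_path = os.path.splitext(image_path)[0] + '.txt'
    if os.path.exists(txt_path):
        with open(txt_path, 'r', encoding='utf-8') as f:
            print(image_path, f.read().strip())
```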
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kokona_bluearchive',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
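Each item yielded by `LocalSource` exposes its tags through `item.meta['tags']`, so a plain-Python filter is enough to select a subset. A minimal sketch, assuming the tags are stored as a mapping from tag name to confidence score (the exact value type may differ):
```python
from waifuc.source import LocalSource

# keep only solo shots on a white background (tag storage format assumed)
wanted = {'solo', 'white_background'}
selected = [
    item for item in LocalSource('dataset_dir')
    if wanted.issubset(set(item.meta['tags']))
]
print(f'{len(selected)} matching images')
```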
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 35 |  |  |  |  |  | 1girl, black_dress, looking_at_viewer, solo, vertical-striped_clothes, black_jacket, diamond_cutout, simple_background, long_sleeves, blush, vertical-striped_dress, white_background, off_shoulder, chinese_clothes, closed_mouth, white_skirt, sleeveless, clothing_cutout, open_clothes, cowboy_shot, frilled_skirt |
| 1 | 6 |  |  |  |  |  | 1girl, black_dress, black_jacket, black_socks, full_body, long_sleeves, pelvic_curtain, simple_background, sneakers, solo, vertical-striped_clothes, vertical-striped_dress, diamond_cutout, off_shoulder, standing, white_background, white_footwear, blush, closed_mouth, sleeveless, white_skirt, bag, china_dress, looking_at_viewer, open_clothes, pout |
| 2 | 5 |  |  |  |  |  | 1boy, 1girl, black_dress, blush, hetero, loli, black_jacket, simple_background, solo_focus, vertical-striped_dress, blue_halo, looking_at_viewer, sleeveless, tongue_out, uncensored, vertical-striped_clothes, licking_penis, open_mouth, pov |
| 3 | 7 |  |  |  |  |  | 1boy, 1girl, blush, hetero, loli, navel, nipples, open_mouth, penis, spread_legs, blue_halo, completely_nude, pussy, sex, solo_focus, vaginal, bar_censor, tiger_girl, stomach_bulge, yellow_eyes |
| 4 | 7 |  |  |  |  |  | 1girl, blue_halo, blush, navel, simple_background, loli, micro_bikini, white_bikini, open_mouth, solo, cameltoe, cowboy_shot, looking_at_viewer, tiger_girl, white_background, collarbone, heart, parted_lips |
| 5 | 18 |  |  |  |  |  | 1girl, looking_at_viewer, pussy, completely_nude, loli, navel, blush, nipples, solo, barefoot, blue_halo, toes, cleft_of_venus, uncensored, anus, closed_mouth, collarbone, feet, flat_chest, simple_background, soles, spread_legs, white_background, open_mouth |
| 6 | 10 |  |  |  |  |  | blush, white_apron, 1girl, frilled_apron, simple_background, maid_headdress, blue_halo, enmaided, looking_at_viewer, solo, white_background, black_dress, maid_apron, puffy_short_sleeves, waist_apron, hair_between_eyes, black_footwear, bowtie, full_body, open_mouth, shoes, white_thighhighs |
| 7 | 12 |  |  |  |  |  | 1girl, alternate_costume, playboy_bunny, strapless_leotard, detached_collar, looking_at_viewer, solo, blush, fake_animal_ears, open_mouth, rabbit_ears, simple_background, white_background, wrist_cuffs, bare_shoulders, black_leotard, covered_navel, highleg_leotard, pantyhose, red_bowtie, blue_halo |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | looking_at_viewer | solo | vertical-striped_clothes | black_jacket | diamond_cutout | simple_background | long_sleeves | blush | vertical-striped_dress | white_background | off_shoulder | chinese_clothes | closed_mouth | white_skirt | sleeveless | clothing_cutout | open_clothes | cowboy_shot | frilled_skirt | black_socks | full_body | pelvic_curtain | sneakers | standing | white_footwear | bag | china_dress | pout | 1boy | hetero | loli | solo_focus | blue_halo | tongue_out | uncensored | licking_penis | open_mouth | pov | navel | nipples | penis | spread_legs | completely_nude | pussy | sex | vaginal | bar_censor | tiger_girl | stomach_bulge | yellow_eyes | micro_bikini | white_bikini | cameltoe | collarbone | heart | parted_lips | barefoot | toes | cleft_of_venus | anus | feet | flat_chest | soles | white_apron | frilled_apron | maid_headdress | enmaided | maid_apron | puffy_short_sleeves | waist_apron | hair_between_eyes | black_footwear | bowtie | shoes | white_thighhighs | alternate_costume | playboy_bunny | strapless_leotard | detached_collar | fake_animal_ears | rabbit_ears | wrist_cuffs | bare_shoulders | black_leotard | covered_navel | highleg_leotard | pantyhose | red_bowtie |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------------------|:-------|:---------------------------|:---------------|:-----------------|:--------------------|:---------------|:--------|:-------------------------|:-------------------|:---------------|:------------------|:---------------|:--------------|:-------------|:------------------|:---------------|:--------------|:----------------|:--------------|:------------|:-----------------|:-----------|:-----------|:-----------------|:------|:--------------|:-------|:-------|:---------|:-------|:-------------|:------------|:-------------|:-------------|:----------------|:-------------|:------|:--------|:----------|:--------|:--------------|:------------------|:--------|:------|:----------|:-------------|:-------------|:----------------|:--------------|:---------------|:---------------|:-----------|:-------------|:--------|:--------------|:-----------|:-------|:-----------------|:-------|:-------|:-------------|:--------|:--------------|:----------------|:-----------------|:-----------|:-------------|:----------------------|:--------------|:--------------------|:-----------------|:---------|:--------|:-------------------|:--------------------|:----------------|:--------------------|:------------------|:-------------------|:--------------|:--------------|:-----------------|:----------------|:----------------|:------------------|:------------|:-------------|
| 0 | 35 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | X | X | | X | | X | X | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | X | | | | X | | X | | X | | | | | | | | X | | | | | | | | | | | | | X | | X | | | | X | | X | | | | | | | | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 18 |  |  |  |  |  | X | | X | X | | | | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | X | | X | | X | | X | | X | X | | X | X | X | | | | | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | X | X | X | | | | X | | X | | X | | | | | | | | | | | X | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 7 | 12 |  |  |  |  |  | X | | X | X | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
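To mine one of these outfits from an `IMG+TXT` package, you can match each caption file against a cluster's characteristic tags. A minimal sketch using a few tags from cluster #6 (the maid outfit), assuming comma-separated caption files as in the package sketch above:
```python
import os
from glob import glob

# a few tags characteristic of cluster #6 (the maid outfit)
cluster_tags = {'maid_headdress', 'white_apron', 'enmaided'}

matches = []
for txt_path in glob(os.path.join('dataset_1200', '**', '*.txt'), recursive=True):
    with open(txt_path, 'r', encoding='utf-8') as f:
        tags = {t.strip() for t in f.read().split(',')}
    if cluster_tags.issubset(tags):
        matches.append(os.path.splitext(txt_path)[0])
print(f'{len(matches)} images match the maid-outfit cluster')
```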
open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_ori | ---
pretty_name: Evaluation run of Lvxy1117/amber_fine_tune_ori
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Lvxy1117/amber_fine_tune_ori](https://huggingface.co/Lvxy1117/amber_fine_tune_ori)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_ori\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T04:45:56.417664](https://huggingface.co/datasets/open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_ori/blob/main/results_2024-02-02T04-45-56.417664.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2703950535933958,\n\
\ \"acc_stderr\": 0.03111275987496194,\n \"acc_norm\": 0.2718823298213839,\n\
\ \"acc_norm_stderr\": 0.03187597464930614,\n \"mc1\": 0.20930232558139536,\n\
\ \"mc1_stderr\": 0.014241219434785827,\n \"mc2\": 0.3493607326810005,\n\
\ \"mc2_stderr\": 0.015039024893260722\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.42406143344709896,\n \"acc_stderr\": 0.0144418896274644,\n\
\ \"acc_norm\": 0.4445392491467577,\n \"acc_norm_stderr\": 0.014521226405627079\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5802628958374826,\n\
\ \"acc_stderr\": 0.004925072159723838,\n \"acc_norm\": 0.7510456084445329,\n\
\ \"acc_norm_stderr\": 0.004315236154543956\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073465,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073465\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.030167533468632726,\n\
\ \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.030167533468632726\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.02749566368372406,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.02749566368372406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080339,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080339\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231008,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231008\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.02226181769240016,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.02226181769240016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2032258064516129,\n \"acc_stderr\": 0.022891687984554966,\n \"\
acc_norm\": 0.2032258064516129,\n \"acc_norm_stderr\": 0.022891687984554966\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114485,\n \"\
acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25757575757575757,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.02951928261681723,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.02951928261681723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n\
\ \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.02684151432295893,\n \
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.02684151432295893\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1574074074074074,\n \"acc_stderr\": 0.024837173518242387,\n \"\
acc_norm\": 0.1574074074074074,\n \"acc_norm_stderr\": 0.024837173518242387\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695063,\n \"\
acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695063\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658346,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658346\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.39461883408071746,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.39461883408071746,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3053435114503817,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.3053435114503817,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794089,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794089\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.0449394906861354,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.0449394906861354\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.32905982905982906,\n\
\ \"acc_stderr\": 0.030782321577688156,\n \"acc_norm\": 0.32905982905982906,\n\
\ \"acc_norm_stderr\": 0.030782321577688156\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3128991060025543,\n\
\ \"acc_stderr\": 0.01658093594030406,\n \"acc_norm\": 0.3128991060025543,\n\
\ \"acc_norm_stderr\": 0.01658093594030406\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.02394851290546835,\n\
\ \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.02394851290546835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02564686309713791,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02564686309713791\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.025329888171900922,\n\
\ \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.025329888171900922\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290382,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290382\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27640156453715775,\n\
\ \"acc_stderr\": 0.01142215319455358,\n \"acc_norm\": 0.27640156453715775,\n\
\ \"acc_norm_stderr\": 0.01142215319455358\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.024060599423487414,\n\
\ \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.024060599423487414\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987862,\n \
\ \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987862\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.02520696315422541,\n\
\ \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.02520696315422541\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4093567251461988,\n \"acc_stderr\": 0.037712831076265434,\n\
\ \"acc_norm\": 0.4093567251461988,\n \"acc_norm_stderr\": 0.037712831076265434\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20930232558139536,\n\
\ \"mc1_stderr\": 0.014241219434785827,\n \"mc2\": 0.3493607326810005,\n\
\ \"mc2_stderr\": 0.015039024893260722\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6314127861089187,\n \"acc_stderr\": 0.013558447570099323\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01288855193328279,\n \
\ \"acc_stderr\": 0.0031069012664996674\n }\n}\n```"
repo_url: https://huggingface.co/Lvxy1117/amber_fine_tune_ori
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|arc:challenge|25_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|gsm8k|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hellaswag|10_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T04-45-56.417664.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T04-45-56.417664.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- '**/details_harness|winogrande|5_2024-02-02T04-45-56.417664.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T04-45-56.417664.parquet'
- config_name: results
data_files:
- split: 2024_02_02T04_45_56.417664
path:
- results_2024-02-02T04-45-56.417664.parquet
- split: latest
path:
- results_2024-02-02T04-45-56.417664.parquet
---
# Dataset Card for Evaluation run of Lvxy1117/amber_fine_tune_ori
Dataset automatically created during the evaluation run of model [Lvxy1117/amber_fine_tune_ori](https://huggingface.co/Lvxy1117/amber_fine_tune_ori) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_ori",
                    "harness_winogrande_5",
                    split="latest")
```
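The 63 per-task configurations can also be enumerated programmatically instead of being copied from the YAML header, and the aggregated metrics live in the dedicated `results` configuration. A short sketch using `get_dataset_config_names` from `datasets`:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_ori"

# list every configuration exposed by this details repo
configs = get_dataset_config_names(repo)
print(len(configs), "configs, e.g.", configs[:3])

# the "latest" split of the "results" config always points at the most recent run
results = load_dataset(repo, "results", split="latest")
print(results[0])
```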
## Latest results
These are the [latest results from run 2024-02-02T04:45:56.417664](https://huggingface.co/datasets/open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_ori/blob/main/results_2024-02-02T04-45-56.417664.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2703950535933958,
"acc_stderr": 0.03111275987496194,
"acc_norm": 0.2718823298213839,
"acc_norm_stderr": 0.03187597464930614,
"mc1": 0.20930232558139536,
"mc1_stderr": 0.014241219434785827,
"mc2": 0.3493607326810005,
"mc2_stderr": 0.015039024893260722
},
"harness|arc:challenge|25": {
"acc": 0.42406143344709896,
"acc_stderr": 0.0144418896274644,
"acc_norm": 0.4445392491467577,
"acc_norm_stderr": 0.014521226405627079
},
"harness|hellaswag|10": {
"acc": 0.5802628958374826,
"acc_stderr": 0.004925072159723838,
"acc_norm": 0.7510456084445329,
"acc_norm_stderr": 0.004315236154543956
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073465,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073465
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.030167533468632726,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.030167533468632726
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.02749566368372406,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.02749566368372406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080339,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080339
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231008,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231008
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0404933929774814,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0404933929774814
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.02226181769240016,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.02226181769240016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2032258064516129,
"acc_stderr": 0.022891687984554966,
"acc_norm": 0.2032258064516129,
"acc_norm_stderr": 0.022891687984554966
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114485,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25757575757575757,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.25757575757575757,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.02951928261681723,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.02951928261681723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.02684151432295893,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.02684151432295893
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1574074074074074,
"acc_stderr": 0.024837173518242387,
"acc_norm": 0.1574074074074074,
"acc_norm_stderr": 0.024837173518242387
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695063,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695063
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658346,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.39461883408071746,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.39461883408071746,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3053435114503817,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.3053435114503817,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.0449394906861354,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.0449394906861354
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.32905982905982906,
"acc_stderr": 0.030782321577688156,
"acc_norm": 0.32905982905982906,
"acc_norm_stderr": 0.030782321577688156
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3128991060025543,
"acc_stderr": 0.01658093594030406,
"acc_norm": 0.3128991060025543,
"acc_norm_stderr": 0.01658093594030406
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.02394851290546835,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.02394851290546835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02564686309713791,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02564686309713791
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.025329888171900922,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.025329888171900922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290382,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290382
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27640156453715775,
"acc_stderr": 0.01142215319455358,
"acc_norm": 0.27640156453715775,
"acc_norm_stderr": 0.01142215319455358
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.024060599423487414,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.024060599423487414
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987862,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987862
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.02520696315422541,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.02520696315422541
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.036643147772880864,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.036643147772880864
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4093567251461988,
"acc_stderr": 0.037712831076265434,
"acc_norm": 0.4093567251461988,
"acc_norm_stderr": 0.037712831076265434
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20930232558139536,
"mc1_stderr": 0.014241219434785827,
"mc2": 0.3493607326810005,
"mc2_stderr": 0.015039024893260722
},
"harness|winogrande|5": {
"acc": 0.6314127861089187,
"acc_stderr": 0.013558447570099323
},
"harness|gsm8k|5": {
"acc": 0.01288855193328279,
"acc_stderr": 0.0031069012664996674
}
}
```
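Because the overall `acc` above is an average across tasks, it can be handy to recompute per-benchmark aggregates yourself. Below is a small worked sketch, assuming the dict shown above has been saved to a local file named `results.json` (a hypothetical filename); it averages the accuracies of the `hendrycksTest-*` (MMLU) tasks only:
```python
import json

# Load the "Latest results" dict shown above (assumed saved as results.json).
with open("results.json") as f:
    results = json.load(f)

# Keep only the MMLU (hendrycksTest-*) tasks and average their accuracies.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_accs)} MMLU tasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```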
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jkorsvik/nowiki_abstract_second_scrape_split2 | ---
dataset_info:
features:
- name: url
dtype: string
- name: date_scraped
dtype: string
- name: headline
dtype: string
- name: category
dtype: string
- name: ingress
dtype: string
- name: article
dtype: string
- name: abstract
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 758224560
num_examples: 201819
download_size: 358042111
dataset_size: 758224560
---
# Dataset Card for "nowiki_abstract_second_scrape_split2"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yimingzhang/mmlu_3 | ---
license: mit
task_categories:
- question-answering
language:
- en
pretty_name: MMLU loader with no auxiliary train set
---
This dataset contains a copy of the `cais/mmlu` HF dataset, but without the `auxiliary_train` split, which takes a long time to regenerate each time multiple subsets of the dataset are loaded.
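To load several subjects without that overhead, you can iterate over subject configs just as you would with `cais/mmlu`. A minimal sketch, assuming this copy keeps the upstream subject config names (e.g. `abstract_algebra`, `anatomy`) and their `test` splits:
```python
from datasets import load_dataset

# Subject configs are assumed to mirror cais/mmlu.
subjects = ["abstract_algebra", "anatomy"]
data = {s: load_dataset("yimingzhang/mmlu_3", s, split="test") for s in subjects}
print({s: ds.num_rows for s, ds in data.items()})
```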
Please visit https://huggingface.co/datasets/cais/mmlu for more information on the MMLU dataset. |
maulinnasari/dataset_ext_100_mn | ---
dataset_info:
features:
- name: document
sequence: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 549593593
num_examples: 44972
- name: validation
num_bytes: 67189862
num_examples: 5622
- name: test
num_bytes: 68880410
num_examples: 5622
download_size: 394973176
dataset_size: 685663865
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16 | ---
pretty_name: Evaluation run of Kquant03/Buttercup-V2-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kquant03/Buttercup-V2-bf16](https://huggingface.co/Kquant03/Buttercup-V2-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-15T23:01:38.445097](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16/blob/main/results_2024-02-15T23-01-38.445097.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530538161665846,\n\
\ \"acc_stderr\": 0.031991249389942286,\n \"acc_norm\": 0.6524163789680969,\n\
\ \"acc_norm_stderr\": 0.03266539743840073,\n \"mc1\": 0.554467564259486,\n\
\ \"mc1_stderr\": 0.01739933528014034,\n \"mc2\": 0.6947306262348207,\n\
\ \"mc2_stderr\": 0.015031157853542046\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7039249146757679,\n \"acc_stderr\": 0.013340916085246258,\n\
\ \"acc_norm\": 0.7372013651877133,\n \"acc_norm_stderr\": 0.012862523175351335\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7112129057956582,\n\
\ \"acc_stderr\": 0.004522725412556955,\n \"acc_norm\": 0.885381398127863,\n\
\ \"acc_norm_stderr\": 0.003179100565887989\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n\
\ \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n\
\ \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n\
\ \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n\
\ \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n\
\ \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"\
acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297794,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297794\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092448,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092448\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.01274307294265335,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.01274307294265335\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.554467564259486,\n\
\ \"mc1_stderr\": 0.01739933528014034,\n \"mc2\": 0.6947306262348207,\n\
\ \"mc2_stderr\": 0.015031157853542046\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8650355169692187,\n \"acc_stderr\": 0.009603064913219049\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.686125852918878,\n \
\ \"acc_stderr\": 0.012782681251053198\n }\n}\n```"
repo_url: https://huggingface.co/Kquant03/Buttercup-V2-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|arc:challenge|25_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|gsm8k|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hellaswag|10_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T23-01-38.445097.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T23-01-38.445097.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- '**/details_harness|winogrande|5_2024-02-15T23-01-38.445097.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-15T23-01-38.445097.parquet'
- config_name: results
data_files:
- split: 2024_02_15T23_01_38.445097
path:
- results_2024-02-15T23-01-38.445097.parquet
- split: latest
path:
- results_2024-02-15T23-01-38.445097.parquet
---
# Dataset Card for Evaluation run of Kquant03/Buttercup-V2-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Buttercup-V2-bf16](https://huggingface.co/Kquant03/Buttercup-V2-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16",
"harness_winogrande_5",
split="train")
```
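The `config_name` entries in the YAML header above list all 63 available configurations. If you prefer to enumerate them programmatically rather than reading the header, here is a minimal sketch using `datasets.get_dataset_config_names`:
```python
from datasets import get_dataset_config_names

# List every configuration declared for this details repository
# (one per evaluated task, plus the aggregated "results" config).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16"
)
print(len(configs), configs[:3])
```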
## Latest results
These are the [latest results from run 2024-02-15T23:01:38.445097](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16/blob/main/results_2024-02-15T23-01-38.445097.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6530538161665846,
"acc_stderr": 0.031991249389942286,
"acc_norm": 0.6524163789680969,
"acc_norm_stderr": 0.03266539743840073,
"mc1": 0.554467564259486,
"mc1_stderr": 0.01739933528014034,
"mc2": 0.6947306262348207,
"mc2_stderr": 0.015031157853542046
},
"harness|arc:challenge|25": {
"acc": 0.7039249146757679,
"acc_stderr": 0.013340916085246258,
"acc_norm": 0.7372013651877133,
"acc_norm_stderr": 0.012862523175351335
},
"harness|hellaswag|10": {
"acc": 0.7112129057956582,
"acc_stderr": 0.004522725412556955,
"acc_norm": 0.885381398127863,
"acc_norm_stderr": 0.003179100565887989
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.027479603010538797,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.027479603010538797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297794,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297794
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092448,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092448
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.01274307294265335,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.01274307294265335
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.554467564259486,
"mc1_stderr": 0.01739933528014034,
"mc2": 0.6947306262348207,
"mc2_stderr": 0.015031157853542046
},
"harness|winogrande|5": {
"acc": 0.8650355169692187,
"acc_stderr": 0.009603064913219049
},
"harness|gsm8k|5": {
"acc": 0.686125852918878,
"acc_stderr": 0.012782681251053198
}
}
```
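The aggregated metrics above are also exposed through the `results` configuration declared in the YAML header; a minimal sketch of loading its `latest` split:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics shown above;
# its "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Buttercup-V2-bf16",
    "results",
    split="latest",
)
print(results[0])
```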
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TokenBender/Bengali_chat_dataset | ---
license: apache-2.0
---
|
ShawnGGG/models | ---
license: openrail
---
|
proteinea/cov-abdab | ---
dataset_info:
features:
- name: sequences
dtype: string
splits:
- name: train
num_bytes: 884
num_examples: 7
download_size: 2106
dataset_size: 884
---
# Dataset Card for "cov-abdab"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
316usman/lseembed | ---
dataset_info:
features:
- name: text
dtype: string
- name: scope
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 99923372
num_examples: 155371
download_size: 36257994
dataset_size: 99923372
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TIGER-Lab/Homework-Judge | ---
dataset_info:
features:
- name: user_prompt
dtype: string
- name: output
dtype: string
splits:
- name: extractor
num_bytes: 1039329
num_examples: 833
- name: comparator
num_bytes: 336955
num_examples: 639
download_size: 427476
dataset_size: 1376284
configs:
- config_name: default
data_files:
- split: extractor
path: data/extractor-*
- split: comparator
path: data/comparator-*
---
|
CVasNLPExperiments/DTD_parition1_test_google_flan_t5_xl_mode_A_ns_1880 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 773321
num_examples: 1880
download_size: 174097
dataset_size: 773321
---
# Dataset Card for "DTD_parition1_test_google_flan_t5_xl_mode_A_ns_1880"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KrushiJethe/fashion_data | ---
dataset_info:
features:
- name: image
dtype: image
- name: 'Unnamed: 0'
dtype: int64
- name: articleType
dtype: string
- name: productDisplayName
dtype: string
- name: articleType_label
dtype: int64
splits:
- name: train
num_bytes: 140935233.8
num_examples: 9300
download_size: 122008451
dataset_size: 140935233.8
---
# Dataset Card for "fashion_data"
The dataset consists of 31 classes, each with 300 images and a productDisplayName. It was created from a larger dataset, which you can find [here](https://www.kaggle.com/datasets/paramaggarwal/fashion-product-images-dataset).
The purpose of creating this dataset was to build an image search engine over a database, where the input is an image, text, or audio and the output is a set of images similar to the input.
You can find the project implementation [here](https://github.com/Krushi-Jethe/Image-Search-Engine).
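To get a quick feel for the records, here is a minimal loading sketch (the feature names follow the `dataset_info` header above):
```python
from datasets import load_dataset

# Load the train split and inspect one record.
ds = load_dataset("KrushiJethe/fashion_data", split="train")
example = ds[0]
print(example["articleType"], "-", example["productDisplayName"])
example["image"].show()  # decoded as a PIL image
```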
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
prin2eugen/SDconfig | ---
license: unknown
---
|
DanteWu/CBC_Material | ---
license: afl-3.0
---
|
Anthropic/model-written-evals | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- machine-generated
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Evaluations from "Discovering Language Model Behaviors with Model-Written
Evaluations"
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- gender bias
- social bias
- AI safety
- personality
- politics
task_categories:
- multiple-choice
- zero-shot-classification
- question-answering
task_ids:
- multiple-choice-qa
- multiple-choice-coreference-resolution
---
# Model-Written Evaluation Datasets
This repository includes datasets written by language models, used in our paper on "Discovering Language Model Behaviors with Model-Written Evaluations."
We intend the datasets to be useful to:
1. Those who are interested in understanding the quality and properties of model-generated data
2. Those who wish to use our datasets to evaluate other models for the behaviors we examined in our work (e.g., related to model persona, sycophancy, advanced AI risks, and gender bias)
The evaluations were generated to be asked of dialogue agents (e.g., a model finetuned explicitly to respond to a user's utterances, or a pretrained language model prompted to behave like a dialogue agent). However, it is possible to adapt the data to test other kinds of models as well.
We describe each of our collections of datasets below:
1. `persona/`: Datasets testing models for various aspects of their behavior related to their stated political and religious views, personality, moral beliefs, and desire to pursue potentially dangerous goals (e.g., self-preservation or power-seeking).
2. `sycophancy/`: Datasets testing models for whether or not they repeat back a user's view on various questions (in philosophy, NLP research, and politics).
3. `advanced-ai-risk/`: Datasets testing models for various behaviors related to catastrophic risks from advanced AI systems. These datasets were generated in a few-shot manner. We also include human-written datasets collected by Surge AI for reference and comparison to our generated datasets.
4. `winogenerated/`: Our larger, model-generated version of the Winogender Dataset ([Rudinger et al., 2018](https://arxiv.org/abs/1804.09301)). We also include the names of the occupation titles that we generated to create the dataset (alongside occupation gender statistics from the Bureau of Labor Statistics).
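As an example of how these collections can be loaded, here is a minimal sketch (assuming the files under each directory are stored as JSONL, one record per line):
```python
from datasets import load_dataset

# Load all persona evaluations as a single split via a data_files glob.
persona = load_dataset(
    "Anthropic/model-written-evals",
    data_files="persona/*.jsonl",
    split="train",
)
print(persona[0])
```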
Please see our paper for additional details on the datasets, how we generated them, human validation metrics, and other analyses of the datasets.
**Disclaimer**: As discussed in our paper, some data contains content that includes social biases and stereotypes. The data may also contain other forms of harmful or offensive content. The views expressed in the data do not reflect the views of Anthropic or any of its employees.
## Contact
For questions, please email `ethan at anthropic dot com`
## Bibtex Citation
If you would like to cite our work or data, you may use the following bibtex citation:
```
@misc{perez2022discovering,
doi = {10.48550/ARXIV.2212.09251},
url = {https://arxiv.org/abs/2212.09251},
author = {Perez, Ethan and Ringer, Sam and Lukošiūtė, Kamilė and Nguyen, Karina and Chen, Edwin and Heiner, Scott and Pettit, Craig and Olsson, Catherine and Kundu, Sandipan and Kadavath, Saurav and Jones, Andy and Chen, Anna and Mann, Ben and Israel, Brian and Seethor, Bryan and McKinnon, Cameron and Olah, Christopher and Yan, Da and Amodei, Daniela and Amodei, Dario and Drain, Dawn and Li, Dustin and Tran-Johnson, Eli and Khundadze, Guro and Kernion, Jackson and Landis, James and Kerr, Jamie and Mueller, Jared and Hyun, Jeeyoon and Landau, Joshua and Ndousse, Kamal and Goldberg, Landon and Lovitt, Liane and Lucas, Martin and Sellitto, Michael and Zhang, Miranda and Kingsland, Neerav and Elhage, Nelson and Joseph, Nicholas and Mercado, Noemí and DasSarma, Nova and Rausch, Oliver and Larson, Robin and McCandlish, Sam and Johnston, Scott and Kravec, Shauna and {El Showk}, Sheer and Lanham, Tamera and Telleen-Lawton, Timothy and Brown, Tom and Henighan, Tom and Hume, Tristan and Bai, Yuntao and Hatfield-Dodds, Zac and Clark, Jack and Bowman, Samuel R. and Askell, Amanda and Grosse, Roger and Hernandez, Danny and Ganguli, Deep and Hubinger, Evan and Schiefer, Nicholas and Kaplan, Jared},
keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), Machine Learning (cs.LG), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Discovering Language Model Behaviors with Model-Written Evaluations},
publisher = {arXiv},
year = {2022},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
|
chenyouyou/translation-en-de | ---
license: openrail
task_categories:
- translation
language:
- en
- de
pretty_name: translation_demo
--- |
heliosprime/twitter_dataset_1713155216 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 6634
num_examples: 18
download_size: 10269
dataset_size: 6634
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713155216"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_DUAL-GPO__phi-2-dpo-renew1 | ---
pretty_name: Evaluation run of DUAL-GPO/phi-2-dpo-renew1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DUAL-GPO/phi-2-dpo-renew1](https://huggingface.co/DUAL-GPO/phi-2-dpo-renew1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DUAL-GPO__phi-2-dpo-renew1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T23:06:52.610510](https://huggingface.co/datasets/open-llm-leaderboard/details_DUAL-GPO__phi-2-dpo-renew1/blob/main/results_2024-04-15T23-06-52.610510.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.585468211023594,\n\
\ \"acc_stderr\": 0.033738595435610585,\n \"acc_norm\": 0.5876967000100922,\n\
\ \"acc_norm_stderr\": 0.034421919213948445,\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5119014338994788,\n\
\ \"mc2_stderr\": 0.015603631989148796\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6100682593856656,\n \"acc_stderr\": 0.014252959848892894,\n\
\ \"acc_norm\": 0.6407849829351536,\n \"acc_norm_stderr\": 0.014020224155839154\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5891256721768572,\n\
\ \"acc_stderr\": 0.004909870006388837,\n \"acc_norm\": 0.7745469030073691,\n\
\ \"acc_norm_stderr\": 0.004170263338211791\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936336,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936336\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"\
acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\
\ \"acc_stderr\": 0.026148685930671742,\n \"acc_norm\": 0.6967741935483871,\n\
\ \"acc_norm_stderr\": 0.026148685930671742\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.036974422050315967,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.036974422050315967\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.02869787397186067,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.02869787397186067\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.02478431694215639,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215639\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131137,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131137\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.03156663099215416,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03156663099215416\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266875,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266875\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6862745098039216,\n \"acc_stderr\": 0.032566854844603886,\n \"\
acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.032566854844603886\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419996,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419996\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6960408684546615,\n\
\ \"acc_stderr\": 0.016448321686769043,\n \"acc_norm\": 0.6960408684546615,\n\
\ \"acc_norm_stderr\": 0.016448321686769043\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688235,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688235\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27932960893854747,\n\
\ \"acc_stderr\": 0.015005762446786164,\n \"acc_norm\": 0.27932960893854747,\n\
\ \"acc_norm_stderr\": 0.015005762446786164\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630995,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630995\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087378,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087378\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.03034326422421352,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.03034326422421352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5571895424836601,\n \"acc_stderr\": 0.020095083154577354,\n \
\ \"acc_norm\": 0.5571895424836601,\n \"acc_norm_stderr\": 0.020095083154577354\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.02947525023601718,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.02947525023601718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5119014338994788,\n\
\ \"mc2_stderr\": 0.015603631989148796\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7332280978689818,\n \"acc_stderr\": 0.01243004610214433\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5223654283548143,\n \
\ \"acc_stderr\": 0.01375869948591184\n }\n}\n```"
repo_url: https://huggingface.co/DUAL-GPO/phi-2-dpo-renew1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|arc:challenge|25_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|gsm8k|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hellaswag|10_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T23-06-52.610510.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T23-06-52.610510.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- '**/details_harness|winogrande|5_2024-04-15T23-06-52.610510.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T23-06-52.610510.parquet'
- config_name: results
data_files:
- split: 2024_04_15T23_06_52.610510
path:
- results_2024-04-15T23-06-52.610510.parquet
- split: latest
path:
- results_2024-04-15T23-06-52.610510.parquet
---
# Dataset Card for Evaluation run of DUAL-GPO/phi-2-dpo-renew1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DUAL-GPO/phi-2-dpo-renew1](https://huggingface.co/DUAL-GPO/phi-2-dpo-renew1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DUAL-GPO__phi-2-dpo-renew1",
"harness_winogrande_5",
	split="latest")
```
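The aggregated metrics live in the "results" configuration listed in the YAML above, whose "latest" split points at `results_2024-04-15T23-06-52.610510.parquet`. A minimal sketch of loading it, using the same `datasets` call as above:
```python
from datasets import load_dataset

# "results" and the "latest" split are both taken from the YAML configs above.
results = load_dataset(
    "open-llm-leaderboard/details_DUAL-GPO__phi-2-dpo-renew1",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated metrics
```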
## Latest results
These are the [latest results from run 2024-04-15T23:06:52.610510](https://huggingface.co/datasets/open-llm-leaderboard/details_DUAL-GPO__phi-2-dpo-renew1/blob/main/results_2024-04-15T23-06-52.610510.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.585468211023594,
"acc_stderr": 0.033738595435610585,
"acc_norm": 0.5876967000100922,
"acc_norm_stderr": 0.034421919213948445,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5119014338994788,
"mc2_stderr": 0.015603631989148796
},
"harness|arc:challenge|25": {
"acc": 0.6100682593856656,
"acc_stderr": 0.014252959848892894,
"acc_norm": 0.6407849829351536,
"acc_norm_stderr": 0.014020224155839154
},
"harness|hellaswag|10": {
"acc": 0.5891256721768572,
"acc_stderr": 0.004909870006388837,
"acc_norm": 0.7745469030073691,
"acc_norm_stderr": 0.004170263338211791
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.455026455026455,
"acc_stderr": 0.025646928361049398,
"acc_norm": 0.455026455026455,
"acc_norm_stderr": 0.025646928361049398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.026148685930671742,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.026148685930671742
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.036974422050315967,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.036974422050315967
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.02869787397186067,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.02869787397186067
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215639,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215639
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131137,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131137
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266875,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266875
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.032566854844603886,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.032566854844603886
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419996,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419996
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6960408684546615,
"acc_stderr": 0.016448321686769043,
"acc_norm": 0.6960408684546615,
"acc_norm_stderr": 0.016448321686769043
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688235,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688235
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27932960893854747,
"acc_stderr": 0.015005762446786164,
"acc_norm": 0.27932960893854747,
"acc_norm_stderr": 0.015005762446786164
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630995,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630995
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666904,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087378,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087378
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.03034326422421352,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.03034326422421352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5571895424836601,
"acc_stderr": 0.020095083154577354,
"acc_norm": 0.5571895424836601,
"acc_norm_stderr": 0.020095083154577354
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.02947525023601718,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.02947525023601718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5119014338994788,
"mc2_stderr": 0.015603631989148796
},
"harness|winogrande|5": {
"acc": 0.7332280978689818,
"acc_stderr": 0.01243004610214433
},
"harness|gsm8k|5": {
"acc": 0.5223654283548143,
"acc_stderr": 0.01375869948591184
}
}
```
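As a quick sanity check, the per-task numbers above can be rolled up by hand. A minimal sketch, assuming you paste the dict printed above into `results` (only three hendrycksTest entries are reproduced here for brevity):
```python
import statistics

# Abbreviated copy of the results dict above; the full dict has one
# "harness|hendrycksTest-<subject>|5" entry per MMLU subject.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.24},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.45925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6052631578947368},
}

# Averaging acc_norm over the hendrycksTest subtasks gives an
# MMLU-style aggregate.
mmlu = statistics.mean(
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
)
print(round(mmlu, 4))
```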
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yuan-sf63/word_label_0.2_16_P | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
splits:
- name: train
num_bytes: 8720066.7
num_examples: 48906
- name: validation
num_bytes: 968896.3
num_examples: 5434
download_size: 2525995
dataset_size: 9688963.0
---
# Dataset Card for "word_label_0.2_16_P"
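A minimal loading sketch, with the split and column names taken from the `dataset_info` block above (a `text` string column plus sixteen integer label columns named `'0'` through `'15'`):
```python
from datasets import load_dataset

# Splits ("train"/"validation") and columns come from dataset_info above.
ds = load_dataset("yuan-sf63/word_label_0.2_16_P")
example = ds["train"][0]
print(example["text"], [example[str(i)] for i in range(16)])
```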
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora | ---
pretty_name: Evaluation run of ehartford/minotaur-llama2-13b-qlora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/minotaur-llama2-13b-qlora](https://huggingface.co/ehartford/minotaur-llama2-13b-qlora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T15:04:43.110639](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora/blob/main/results_2023-10-18T15-04-43.110639.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08305369127516779,\n\
\ \"em_stderr\": 0.0028261230954209926,\n \"f1\": 0.14533661912751625,\n\
\ \"f1_stderr\": 0.003000368188887415,\n \"acc\": 0.4414884036541998,\n\
\ \"acc_stderr\": 0.010464953595556116\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08305369127516779,\n \"em_stderr\": 0.0028261230954209926,\n\
\ \"f1\": 0.14533661912751625,\n \"f1_stderr\": 0.003000368188887415\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12054586808188021,\n \
\ \"acc_stderr\": 0.008968608285309073\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803159\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/minotaur-llama2-13b-qlora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T15_04_43.110639
path:
- '**/details_harness|drop|3_2023-10-18T15-04-43.110639.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T15-04-43.110639.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T15_04_43.110639
path:
- '**/details_harness|gsm8k|5_2023-10-18T15-04-43.110639.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T15-04-43.110639.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:34:00.982275.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:34:00.982275.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T01:34:00.982275.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T15_04_43.110639
path:
- '**/details_harness|winogrande|5_2023-10-18T15-04-43.110639.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T15-04-43.110639.parquet'
- config_name: results
data_files:
- split: 2023_08_18T01_34_00.982275
path:
- results_2023-08-18T01:34:00.982275.parquet
- split: 2023_10_18T15_04_43.110639
path:
- results_2023-10-18T15-04-43.110639.parquet
- split: latest
path:
- results_2023-10-18T15-04-43.110639.parquet
---
# Dataset Card for Evaluation run of ehartford/minotaur-llama2-13b-qlora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/minotaur-llama2-13b-qlora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/minotaur-llama2-13b-qlora](https://huggingface.co/ehartford/minotaur-llama2-13b-qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora",
"harness_winogrande_5",
    split="latest")
```
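The aggregated metrics live in the `results` configuration. As a minimal sketch (assuming the standard `datasets` API and the split names declared in this card's metadata), they can be loaded the same way:
```python
from datasets import load_dataset

# Load the aggregated per-run metrics from the "results" configuration;
# per the metadata above, the "latest" split points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora",
    "results",
    split="latest",
)
print(results[0])
```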
## Latest results
These are the [latest results from run 2023-10-18T15:04:43.110639](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora/blob/main/results_2023-10-18T15-04-43.110639.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split of each eval):
```python
{
"all": {
"em": 0.08305369127516779,
"em_stderr": 0.0028261230954209926,
"f1": 0.14533661912751625,
"f1_stderr": 0.003000368188887415,
"acc": 0.4414884036541998,
"acc_stderr": 0.010464953595556116
},
"harness|drop|3": {
"em": 0.08305369127516779,
"em_stderr": 0.0028261230954209926,
"f1": 0.14533661912751625,
"f1_stderr": 0.003000368188887415
},
"harness|gsm8k|5": {
"acc": 0.12054586808188021,
"acc_stderr": 0.008968608285309073
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803159
}
}
```
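If you would rather work with the raw JSON file linked above than with the parquet splits, one option (a sketch assuming only `huggingface_hub`, not part of the card's original instructions) is to download it directly:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the results JSON for the latest run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ehartford__minotaur-llama2-13b-qlora",
    repo_type="dataset",
    filename="results_2023-10-18T15-04-43.110639.json",
)
with open(path) as f:
    results = json.load(f)

# The exact top-level layout can differ between harness versions, so inspect the keys first.
print(list(results.keys()))
```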
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Minglii/e15 | ---
dataset_info:
features:
- name: data
struct:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 5112113
num_examples: 7800
download_size: 2914272
dataset_size: 5112113
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "e15"
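Given the feature schema declared in the metadata above (a `data` struct holding a list of `conversations` with `from`/`value` fields plus an `id`), a minimal loading sketch could look like this:
```python
from datasets import load_dataset

# Load the single "train" split (7,800 examples per the metadata above).
ds = load_dataset("Minglii/e15", split="train")

# Each row carries a "data" struct with an id and a list of chat turns.
example = ds[0]
print(example["data"]["id"])
for turn in example["data"]["conversations"]:
    print(turn["from"], ":", turn["value"][:80])
```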
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2 | ---
pretty_name: Evaluation run of migtissera/Synthia-34B-v1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-34B-v1.2](https://huggingface.co/migtissera/Synthia-34B-v1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T20:05:34.645170](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2/blob/main/results_2023-09-18T20-05-34.645170.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5320903185183409,\n\
\ \"acc_stderr\": 0.03517517994960793,\n \"acc_norm\": 0.5358397153796313,\n\
\ \"acc_norm_stderr\": 0.03516397638431902,\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.4467341818408572,\n\
\ \"mc2_stderr\": 0.014969799807071376\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5119453924914675,\n \"acc_stderr\": 0.014607220340597171,\n\
\ \"acc_norm\": 0.5486348122866894,\n \"acc_norm_stderr\": 0.01454210456995527\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5587532364070902,\n\
\ \"acc_stderr\": 0.00495521278783238,\n \"acc_norm\": 0.7432782314280024,\n\
\ \"acc_norm_stderr\": 0.004359318206428689\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.5433962264150943,\n \"acc_stderr\": 0.030656748696739428,\n \
\ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.030656748696739428\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.03812400565974834,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.03812400565974834\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.027575960723278243,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.027575960723278243\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.032018671228777947,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.032018671228777947\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.02534267129380725,\n \
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.02534267129380725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6954128440366972,\n \"acc_stderr\": 0.019732299420354052,\n \"\
acc_norm\": 0.6954128440366972,\n \"acc_norm_stderr\": 0.019732299420354052\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"\
acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475544,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395593,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395593\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n\
\ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.04616631111801714,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.04616631111801714\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n\
\ \"acc_stderr\": 0.0276019213814176,\n \"acc_norm\": 0.7692307692307693,\n\
\ \"acc_norm_stderr\": 0.0276019213814176\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6743295019157088,\n\
\ \"acc_stderr\": 0.016757989458549675,\n \"acc_norm\": 0.6743295019157088,\n\
\ \"acc_norm_stderr\": 0.016757989458549675\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.02651126136940925,\n\
\ \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.02651126136940925\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n\
\ \"acc_stderr\": 0.016051419760310263,\n \"acc_norm\": 0.35977653631284917,\n\
\ \"acc_norm_stderr\": 0.016051419760310263\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.028580341065138296,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.028580341065138296\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.027667138569422708,\n\
\ \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.027667138569422708\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.378748370273794,\n\
\ \"acc_stderr\": 0.012389052105003732,\n \"acc_norm\": 0.378748370273794,\n\
\ \"acc_norm_stderr\": 0.012389052105003732\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.48366013071895425,\n \"acc_stderr\": 0.020217030653186457,\n \
\ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.020217030653186457\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.047381987035454834,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.047381987035454834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.031343283582089536,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.031343283582089536\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03615507630310935,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03615507630310935\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n\
\ \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.4467341818408572,\n\
\ \"mc2_stderr\": 0.014969799807071376\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-34B-v1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|arc:challenge|25_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hellaswag|10_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T20-05-34.645170.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T20-05-34.645170.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T20-05-34.645170.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T20-05-34.645170.parquet'
- config_name: results
data_files:
- split: 2023_09_18T20_05_34.645170
path:
- results_2023-09-18T20-05-34.645170.parquet
- split: latest
path:
- results_2023-09-18T20-05-34.645170.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-34B-v1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-34B-v1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-34B-v1.2](https://huggingface.co/migtissera/Synthia-34B-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the results of the most recent run.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2",
"harness_truthfulqa_mc_0",
split="train")
```
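The aggregated results live in the "results" configuration defined in this card's metadata; a minimal sketch of loading them (the "latest" split points to the most recent run):

```python
from datasets import load_dataset

# aggregated metrics of the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2",
    "results",
    split="latest")
```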
## Latest results
These are the [latest results from run 2023-09-18T20:05:34.645170](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2/blob/main/results_2023-09-18T20-05-34.645170.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5320903185183409,
"acc_stderr": 0.03517517994960793,
"acc_norm": 0.5358397153796313,
"acc_norm_stderr": 0.03516397638431902,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671362,
"mc2": 0.4467341818408572,
"mc2_stderr": 0.014969799807071376
},
"harness|arc:challenge|25": {
"acc": 0.5119453924914675,
"acc_stderr": 0.014607220340597171,
"acc_norm": 0.5486348122866894,
"acc_norm_stderr": 0.01454210456995527
},
"harness|hellaswag|10": {
"acc": 0.5587532364070902,
"acc_stderr": 0.00495521278783238,
"acc_norm": 0.7432782314280024,
"acc_norm_stderr": 0.004359318206428689
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.030656748696739428,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.030656748696739428
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.03812400565974834,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.03812400565974834
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278243,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278243
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.032018671228777947,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.032018671228777947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6954128440366972,
"acc_stderr": 0.019732299420354052,
"acc_norm": 0.6954128440366972,
"acc_norm_stderr": 0.019732299420354052
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.033448873829978666,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.033448873829978666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395593,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395593
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884123,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884123
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.04616631111801714,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.04616631111801714
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.0276019213814176,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.0276019213814176
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6743295019157088,
"acc_stderr": 0.016757989458549675,
"acc_norm": 0.6743295019157088,
"acc_norm_stderr": 0.016757989458549675
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5867052023121387,
"acc_stderr": 0.02651126136940925,
"acc_norm": 0.5867052023121387,
"acc_norm_stderr": 0.02651126136940925
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310263,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.028580341065138296,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.028580341065138296
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.027667138569422708,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.027667138569422708
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.378748370273794,
"acc_stderr": 0.012389052105003732,
"acc_norm": 0.378748370273794,
"acc_norm_stderr": 0.012389052105003732
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.020217030653186457,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.020217030653186457
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.047381987035454834,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.047381987035454834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.031343283582089536,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.031343283582089536
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03615507630310935,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03615507630310935
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671362,
"mc2": 0.4467341818408572,
"mc2_stderr": 0.014969799807071376
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-5c4aa4-2355874139 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: google/pegasus-cnn_dailymail
metrics: ['meteor']
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-cnn_dailymail
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@AkankshaK](https://huggingface.co/AkankshaK) for evaluating this model. |
heliosprime/twitter_dataset_1713194896 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 18893
num_examples: 54
download_size: 17999
dataset_size: 18893
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713194896"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gykim80/laondataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9704
num_examples: 28
download_size: 8881
dataset_size: 9704
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-101000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 643508
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BeIR/climate-fever-generated-queries | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
- zero-shot-retrieval
- information-retrieval
- zero-shot-information-retrieval
task_ids:
- passage-retrieval
- entity-linking-retrieval
- fact-checking-retrieval
- tweet-retrieval
- citation-prediction-retrieval
- duplication-question-retrieval
- argument-retrieval
- news-retrieval
- biomedical-information-retrieval
- question-answering-retrieval
---
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
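For example, any of the preprocessed datasets can be downloaded and loaded with the `beir` package; a minimal sketch using the Climate-FEVER archive linked in the Data Splits table below:

```python
from beir import util
from beir.datasets.data_loader import GenericDataLoader

# download and unzip the preprocessed dataset archive
url = "https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip"
data_path = util.download_and_unzip(url, "datasets")

# corpus:  doc_id -> {"title": ..., "text": ...}
# queries: query_id -> query text
# qrels:   query_id -> {doc_id: relevance score}
corpus, queries, qrels = GenericDataLoader(data_folder=data_path).load(split="test")
```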
### Supported Tasks and Leaderboards
The dataset supports a leaderboard that evaluates models on zero-shot retrieval quality, with results typically reported as nDCG@10.
The current best performing models can be found [here](https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields: `_id` (unique document identifier), `title` (document title, optional) and `text` (document paragraph or passage). For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields: `_id` (unique query identifier) and `text` (query text). For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`
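As a minimal sketch, these files can also be read by hand (assuming the standard `corpus.jsonl`, `queries.jsonl` and `qrels/test.tsv` names used in the download archives):

```python
import csv
import json

# corpus.jsonl and queries.jsonl hold one JSON object per line
corpus, queries = {}, {}
with open("corpus.jsonl") as f:
    for line in f:
        doc = json.loads(line)
        corpus[doc["_id"]] = {"title": doc.get("title", ""), "text": doc["text"]}
with open("queries.jsonl") as f:
    for line in f:
        query = json.loads(line)
        queries[query["_id"]] = query["text"]

# the qrels file is tab-separated with a header row: query-id, corpus-id, score
qrels = {}
with open("qrels/test.tsv") as f:
    for row in csv.DictReader(f, delimiter="\t"):
        qrels.setdefault(row["query-id"], {})[row["corpus-id"]] = int(row["score"])
```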
### Data Instances
A high-level example of any BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query-document relevance judgements, made up of:
  - `query-id`: a `string` feature representing the query id
  - `corpus-id`: a `string` feature, denoting the document id.
  - `score`: an `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
AlexEgito/minhavoz | ---
license: openrail
---
|
Zhuoran918/testDonut | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 4076109.0
num_examples: 6
- name: test
num_bytes: 546113.0
num_examples: 1
- name: validation
num_bytes: 647225.0
num_examples: 1
download_size: 4225252
dataset_size: 5269447.0
---
# Dataset Card for "testDonut"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-fed20ca6-7444804 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- wikiann
eval_info:
task: entity_extraction
model: transformersbook/xlm-roberta-base-finetuned-panx-all
metrics: ['matthews_correlation']
dataset_name: wikiann
dataset_config: en
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: transformersbook/xlm-roberta-base-finetuned-panx-all
* Dataset: wikiann
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
zoharli/sst2_priv | ---
dataset_info:
features:
- name: idx
dtype: int32
- name: sentence
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 514988
num_examples: 6734
download_size: 374542
dataset_size: 514988
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
OdiaGenAIdata/pre_train_odia_data | ---
extra_gated_prompt: You agree to use this dataset solely for non-commercial and research purposes.
extra_gated_fields:
Country: country
Specific date: date_picker
I want to use this dataset for:
type: select
options:
- Research
- Education
- label: Other
value: other
I agree to use this dataset for non-commercial use ONLY: checkbox
configs:
- config_name: wiki
data_files:
- split: train
path: wiki_odia_253Ks_4p11Mw.json
- config_name: oscar
data_files:
- split: train
path: oscar_odia_1p16Ms_25Mw.json
- config_name: paraphrasing
data_files:
- split: train
path: paraphrasing_odia_105Ks_2p3Mw.json
- config_name: indicQA
data_files:
- split: train
path: indicQA_odia_12Ks_184Kw.json
- config_name: sentiment_analysis
data_files:
- split: train
path: sentiment_analysis_odia_1Ks_34Kw.json
- config_name: odiaencorp
data_files:
- split: train
path: odiaencorp_85Ks_1p1Mw.json
- config_name: xp3
data_files:
- split: train
path: xp3_261Ks_4p9Mw.json
- config_name: samanantar
data_files:
- split: train
path: samanantar_odia_990Ks_10Mw.json
- config_name: cultureax
data_files:
- split: train
path: cultureax_odia_2p9Ks_49Mw.json
- config_name: pmo
data_files:
- split: train
path: pmo_data.json
- config_name: varta
data_files:
- split: validation
path: val_or_shard_01.json
- split: train
path: train_or_shard_01.json
size_categories:
- 1M<n<10M
---
The present dataset is compiled from the following datasets:
# CulturaX
- License - ODC-By, CC0, [Paper](https://arxiv.org/pdf/2309.09400.pdf)
- Source - https://huggingface.co/datasets/uonlp/CulturaX/viewer/or?
- 49M tokens, 2.9M sentences
- Collection of different versions of OSCAR (Common Crawl data) and the mC4 dataset (Common Crawl's web crawl corpus). mC4 forms 66% of the CulturaX dataset.
# IndicQA
- License - cc-by-4.0
- Source - https://huggingface.co/datasets/ai4bharat/IndicQA/viewer/indicqa.or
- 0.23M tokens, 15K sentences
- In-context question-answering in Indic languages.
# Odiaencorp
- License - CC BY-NC-SA 4.0
- Source - https://lindat.mff.cuni.cz/repository/xmlui/handle/11234/1-3211
- 1.1M tokens, 85K sentences
- Odia wiki, PM India, Odia Digital library, Odisha Govt. websites, Bilingual literature books.
# Oscar
- License - cc0-1.0, [Paper](https://arxiv.org/pdf/2201.06642.pdf)
- Source - https://huggingface.co/datasets/oscar-corpus/OSCAR-2201
- 25.7M tokens, 1.2M sentences
- Common crawl data.
# Paraphrasing
- License - cc-by-nc-4.0, [Paper](https://arxiv.org/abs/2203.05437)
- Source - https://huggingface.co/datasets/ai4bharat/IndicParaphrase/viewer/or
- 2.3M tokens, 0.105M sentences
- Rephrasing a given sentence/paragraph.
# PMO
- License - cc-by-nc-4.0
- Source - [PMO website](https://www.pmindia.gov.in/ory/)
- 2.2M tokens, 0.131M sentences
- Speech, Statements, Activities of PM of India.
# Samanantar
- License - cc-by-nc-4.0, [Paper](https://arxiv.org/abs/2104.05596)
- Source - https://huggingface.co/datasets/ai4bharat/samanantar/viewer/or
- 10.25M Tokens, 0.909M Sentences
- Collection of short sentences.
# Sentiment Analysis
- License - Unspecified
- Source - https://huggingface.co/datasets/ai4bharat/IndicSentiment/viewer/translation-or
- 34K Tokens, 1000 Sentences
- Generic sentences and their sentiment analysis.
# Varta
- License - cc
- Source - https://huggingface.co/datasets/rahular/varta
- 193.5M tokens, 14.19M sentences.
- News articles.
# Wiki
- License - cc-by-sa-3.0, gfdl
- Source - https://huggingface.co/datasets/wikimedia/wikipedia/viewer/20231101.or
- 4.1M tokens, 0.253M sentences
- Wikipedia Articles.
# XP3
- License - apache-2.0
- Source - https://huggingface.co/datasets/bigscience/xP3all
- 4.9M tokens, 0.26M sentences
- Generic statements machine-translated into Odia. |
HANTIFARAH/Hindawi-data | ---
dataset_info:
config_name: HANTIFARAH__Hindawi-Books-dataset__ar
features:
- name: text
dtype: string
- name: source
dtype: string
- name: metadata
dtype: string
splits:
- name: train
num_bytes: 1375922914
num_examples: 47460
download_size: 536858836
dataset_size: 1375922914
configs:
- config_name: HANTIFARAH__Hindawi-Books-dataset__ar
data_files:
- split: train
path: HANTIFARAH__Hindawi-Books-dataset__ar/train-*
---
|
MohammedNasri/cv11_ar_mix_denoised | ---
dataset_info:
features:
- name: audio
sequence: float64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 5817636498
num_examples: 10440
download_size: 2862660736
dataset_size: 5817636498
---
# Dataset Card for "cv11_ar_mix_denoised"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tensoic/Saraswati_Short | ---
dataset_info:
features:
- name: response
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 600729011
num_examples: 317286
download_size: 254173357
dataset_size: 600729011
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/af958ea3 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1337
dataset_size: 182
---
# Dataset Card for "af958ea3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/blanc_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of blanc/ブラン/布兰儿/블랑 (Nikke: Goddess of Victory)
This is the dataset of blanc/ブラン/布兰儿/블랑 (Nikke: Goddess of Victory), containing 186 images and their tags.
The core tags of this character are `breasts, long_hair, yellow_eyes, white_hair, bangs, animal_ears, rabbit_ears, fake_animal_ears, very_long_hair, large_breasts, tail, rabbit_tail, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 186 | 342.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blanc_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 186 | 170.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blanc_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 487 | 371.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blanc_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 186 | 291.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blanc_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 487 | 573.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blanc_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/blanc_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, bare_shoulders, detached_collar, fishnet_pantyhose, looking_at_viewer, open_mouth, playboy_bunny, solo, white_leotard, black_necktie, cleavage, fang, short_necktie, strapless_leotard, white_pantyhose, simple_background, wrist_cuffs, :d, blunt_bangs, cowboy_shot, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, ass, bare_shoulders, fake_tail, from_behind, looking_at_viewer, looking_back, playboy_bunny, solo, white_leotard, blush, fishnet_pantyhose, official_alternate_costume, open_mouth, strapless_leotard, white_pantyhose, wrist_cuffs, backless_leotard, sideboob, :d, bare_back, feet_out_of_frame, grey_hair, orange_eyes, simple_background, white_background |
| 2 | 6 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, solo, holding, kimono, open_mouth, wide_sleeves, black_pantyhose, :d, ass, black_footwear, blush, fang, horns |
| 3 | 5 |  |  |  |  |  | 1girl, hair_ornament, looking_at_viewer, simple_background, white_background, wide_sleeves, :d, cleavage, fang, long_sleeves, open_mouth, single_hair_bun, solo, black_bow, black_footwear, black_pantyhose, bodystocking, high_heels, index_finger_raised, standing_on_one_leg, ass, blush, hand_up, leg_up |
| 4 | 7 |  |  |  |  |  | 1boy, blush, hetero, nude, penis, solo_focus, 1girl, detached_collar, nipples, open_mouth, cum_on_breasts, black_necktie, playboy_bunny, spread_legs, sweat, wrist_cuffs, bar_censor, blunt_bangs, dark_skin, hair_intakes, leotard, lying, mosaic_censoring, navel, pussy, sex, short_necktie, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | detached_collar | fishnet_pantyhose | looking_at_viewer | open_mouth | playboy_bunny | solo | white_leotard | black_necktie | cleavage | fang | short_necktie | strapless_leotard | white_pantyhose | simple_background | wrist_cuffs | :d | blunt_bangs | cowboy_shot | white_background | ass | fake_tail | from_behind | looking_back | blush | official_alternate_costume | backless_leotard | sideboob | bare_back | feet_out_of_frame | grey_hair | orange_eyes | long_sleeves | holding | kimono | wide_sleeves | black_pantyhose | black_footwear | horns | hair_ornament | single_hair_bun | black_bow | bodystocking | high_heels | index_finger_raised | standing_on_one_leg | hand_up | leg_up | 1boy | hetero | nude | penis | solo_focus | nipples | cum_on_breasts | spread_legs | sweat | bar_censor | dark_skin | hair_intakes | leotard | lying | mosaic_censoring | navel | pussy | sex | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:------------------|:--------------------|:--------------------|:-------------|:----------------|:-------|:----------------|:----------------|:-----------|:-------|:----------------|:--------------------|:------------------|:--------------------|:--------------|:-----|:--------------|:--------------|:-------------------|:------|:------------|:--------------|:---------------|:--------|:-----------------------------|:-------------------|:-----------|:------------|:--------------------|:------------|:--------------|:---------------|:----------|:---------|:---------------|:------------------|:-----------------|:--------|:----------------|:------------------|:------------|:---------------|:-------------|:----------------------|:----------------------|:----------|:---------|:-------|:---------|:-------|:--------|:-------------|:----------|:-----------------|:--------------|:--------|:-------------|:------------|:---------------|:----------|:--------|:-------------------|:--------|:--------|:------|:----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | X | X | X | X | X | X | | | | | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | | X | X | | X | | | | X | | | | | | X | | | | X | | | | X | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | | X | X | | X | | | X | X | | | | X | | X | | | X | X | | | | X | | | | | | | | X | | | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | | | X | X | | | X | | | X | | | | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
jeffwong93/object_detection | ---
license: other
---
|
adrtee4bjak/common_voice_13_0_kk_pseudo_labelled | ---
dataset_info:
config_name: kk
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 13923815.0
num_examples: 453
- name: validation
num_bytes: 10779581.0
num_examples: 369
- name: test
num_bytes: 12248711.0
num_examples: 396
download_size: 35172502
dataset_size: 36952107.0
configs:
- config_name: kk
data_files:
- split: train
path: kk/train-*
- split: validation
path: kk/validation-*
- split: test
path: kk/test-*
---
|
HydraLM/partitioned_v2_standardized_10 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
splits:
- name: train
num_bytes: 11567301.11035192
num_examples: 24108
download_size: 14380705
dataset_size: 11567301.11035192
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v2_standardized_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Imagenet1k_validation_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_50000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_laion_ViT_H_14_2B_simple_specific_rices
num_bytes: 21188875
num_examples: 50000
- name: fewshot_0__Attributes_ViT_L_14_descriptors_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 22293724
num_examples: 50000
download_size: 16302328
dataset_size: 43482599
---
# Dataset Card for "Imagenet1k_validation_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_50000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hobby_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hobby/ホビー/霍比 (Azur Lane)
This is the dataset of hobby/ホビー/霍比 (Azur Lane), containing 54 images and their tags.
The core tags of this character are `long_hair, ribbon, bangs, hairband, hair_ribbon, very_long_hair, bow, red_eyes, black_hairband, purple_eyes, breasts, pink_hair, black_ribbon, animal_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 54 | 46.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hobby_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 54 | 32.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hobby_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 126 | 68.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hobby_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 54 | 43.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hobby_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 126 | 87.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hobby_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hobby_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | white_shirt, 1girl, blush, collared_shirt, looking_at_viewer, pleated_skirt, smile, solo, black_choker, grey_skirt, pink_hairband, black_thighhighs, earrings, short_sleeves, collarbone, dress_shirt, tongue_out, white_background, open_mouth, pink_ribbon, school_uniform, garter_straps, one_eye_closed, bangle, closed_mouth, jacket, school_bag, simple_background, sleeves_rolled_up, torpedo |
| 1 | 23 |  |  |  |  |  | looking_at_viewer, 1girl, blush, solo, bare_shoulders, white_shirt, black_skirt, detached_sleeves, midriff, navel, sleeveless_shirt, smile, collared_shirt, long_sleeves, wide_sleeves, pink_thighhighs, pleated_skirt, suspenders, open_jacket, sleeves_past_fingers, white_background, black_jacket, crop_top, simple_background, belt_buckle, black_footwear, boots, closed_mouth, pink_belt, tongue_out, black_bow, heart, off_shoulder, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | white_shirt | 1girl | blush | collared_shirt | looking_at_viewer | pleated_skirt | smile | solo | black_choker | grey_skirt | pink_hairband | black_thighhighs | earrings | short_sleeves | collarbone | dress_shirt | tongue_out | white_background | open_mouth | pink_ribbon | school_uniform | garter_straps | one_eye_closed | bangle | closed_mouth | jacket | school_bag | simple_background | sleeves_rolled_up | torpedo | bare_shoulders | black_skirt | detached_sleeves | midriff | navel | sleeveless_shirt | long_sleeves | wide_sleeves | pink_thighhighs | suspenders | open_jacket | sleeves_past_fingers | black_jacket | crop_top | belt_buckle | black_footwear | boots | pink_belt | black_bow | heart | off_shoulder |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------|:--------|:--------|:-----------------|:--------------------|:----------------|:--------|:-------|:---------------|:-------------|:----------------|:-------------------|:-----------|:----------------|:-------------|:--------------|:-------------|:-------------------|:-------------|:--------------|:-----------------|:----------------|:-----------------|:---------|:---------------|:---------|:-------------|:--------------------|:--------------------|:----------|:-----------------|:--------------|:-------------------|:----------|:--------|:-------------------|:---------------|:---------------|:------------------|:-------------|:--------------|:-----------------------|:---------------|:-----------|:--------------|:-----------------|:--------|:------------|:------------|:--------|:---------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 1 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | X | X | X | | | | | | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Nexdata/97_Hours_Brazilian_Portuguese_Child_Spontaneous_Speech_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
Portuguese (Brazil) children's real-world casual conversation and monologue speech dataset. It covers self-media, conversation, livestreams, lectures, variety shows and other generic domains, mirroring real-world interactions. Recordings are transcribed with text content, speaker ID, gender, age, accent and other attributes. The dataset was collected from an extensive and geographically diverse pool of speakers (children 12 years old and younger), enhancing model performance on real and complex tasks. Quality has been tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, ensuring that user privacy and legal rights are maintained throughout data collection, storage, and usage; our datasets are GDPR, CCPA, and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1326?source=Huggingface
## Format
16 kHz, 16-bit, WAV, mono channel
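As a quick sanity check, a downloaded clip can be validated against this format with Python's standard `wave` module (a minimal sketch; the file name `sample_clip.wav` is a placeholder):
```python
import wave

# Minimal sketch: check that a clip matches the stated format
# (16 kHz, 16-bit, mono WAV). "sample_clip.wav" is a placeholder name.
with wave.open("sample_clip.wav", "rb") as wav:
    assert wav.getframerate() == 16000  # 16 kHz sampling rate
    assert wav.getsampwidth() == 2      # 16-bit = 2 bytes per sample
    assert wav.getnchannels() == 1      # mono channel
    duration = wav.getnframes() / wav.getframerate()
    print(f"OK: {duration:.2f} s of audio")
```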
## Age
Children 12 years old and younger
## Content category
Including interview, self-media, variety show, etc.
## Recording environment
Low background noise
## Country
Brazil (BR)
## Language(Region) Code
pt-BR
## Language
Portuguese
## Features of annotation
Transcription text, timestamp, speaker ID, gender, noise
## Accuracy
Word Accuracy Rate (WAR) 98%
## Licensing Information
Commercial License
|
pkr7098/bert-base-uncased-bookcorpus-wiki-2022030-en-vocab_size-32000 | ---
dataset_info:
config_name: truncate-512
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 23600541600
num_examples: 6555706
- name: validation
num_bytes: 317304000
num_examples: 88140
download_size: 310440269
dataset_size: 23917845600
configs:
- config_name: truncate-512
data_files:
- split: train
path: truncate-512/train-*
- split: validation
path: truncate-512/validation-*
---
# Dataset Card for "bert-base-uncased-bookcorpus-wiki-2022030-en-vocab_size-32000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rimyy/problemMath-llama2 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: texte
dtype: string
splits:
- name: train
num_bytes: 454646422
num_examples: 200035
download_size: 166931788
dataset_size: 454646422
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/marcia_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of marcia (Fire Emblem)
This is the dataset of marcia (Fire Emblem), containing 76 images and their tags.
The core tags of this character are `short_hair, pink_hair, blue_eyes, headband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 76 | 53.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marcia_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 76 | 41.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marcia_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 109 | 57.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marcia_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 76 | 51.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marcia_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 109 | 68.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marcia_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/marcia_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, hetero, solo_focus, nipples, thighhighs, penis, 1boy, blush, mosaic_censoring, sex, medium_breasts, pussy, cum, vaginal, elbow_gloves, fingerless_gloves, large_breasts, sweat, armor, pegasus_knight_uniform_(fire_emblem), thigh_boots |
| 1 | 22 |  |  |  |  |  | 1girl, elbow_gloves, fingerless_gloves, solo, thighhighs, pegasus_knight_uniform_(fire_emblem), smile, open_mouth, thigh_boots, belt, breastplate, spear, shoulder_armor, looking_at_viewer, dress, zettai_ryouiki |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hetero | solo_focus | nipples | thighhighs | penis | 1boy | blush | mosaic_censoring | sex | medium_breasts | pussy | cum | vaginal | elbow_gloves | fingerless_gloves | large_breasts | sweat | armor | pegasus_knight_uniform_(fire_emblem) | thigh_boots | solo | smile | open_mouth | belt | breastplate | spear | shoulder_armor | looking_at_viewer | dress | zettai_ryouiki |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------------|:----------|:-------------|:--------|:-------|:--------|:-------------------|:------|:-----------------|:--------|:------|:----------|:---------------|:--------------------|:----------------|:--------|:--------|:---------------------------------------|:--------------|:-------|:--------|:-------------|:-------|:--------------|:--------|:-----------------|:--------------------|:--------|:-----------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 22 |  |  |  |  |  | X | | | | X | | | | | | | | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
thisisHJLee/human_essay | ---
license: apache-2.0
---
|
pnkvalavala/Labyrinth | ---
license: mit
language:
- en
tags:
- code
size_categories:
- 100K<n<1M
---
# Labyrinth Dataset
Labyrinth is a code dataset that combines three existing datasets without modifying the data itself, adapting only the structure/format to streamline fine-tuning of [Zephyr](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta) on code.
## Dataset Sources
Labyrinth is composed of code examples and instructions from the following three datasets:
1. [CodeAlpaca](https://github.com/sahil280114/codealpaca/blob/master/data/code_alpaca_20k.json) by [Sahil Chaudhary](https://huggingface.co/sahil2801).
2. [Codegen-instruct](https://github.com/teknium1/GPTeacher/blob/main/Codegen/codegen-instruct.json) by [Teknium](https://huggingface.co/teknium).
3. [llama-2-instruct-121k-code](https://huggingface.co/datasets/emre/llama-2-instruct-121k-code) by [Davut Emre TASAR](https://huggingface.co/emre). |
visswateza/text_to_sql_10k | ---
language:
- en
tags:
- sql
size_categories:
- 1K<n<10K
--- |
autoevaluate/autoeval-staging-eval-autoevaluate__zero-shot-classification-sample-autoevalu-acab52-16766274 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- autoevaluate/zero-shot-classification-sample
eval_info:
task: text_zero_shot_classification
model: autoevaluate/zero-shot-classification
metrics: []
dataset_name: autoevaluate/zero-shot-classification-sample
dataset_config: autoevaluate--zero-shot-classification-sample
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: autoevaluate/zero-shot-classification
* Dataset: autoevaluate/zero-shot-classification-sample
* Config: autoevaluate--zero-shot-classification-sample
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
adityarra07/live_ATC_DAL | ---
dataset_info:
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 30502481.0
num_examples: 28
download_size: 26150276
dataset_size: 30502481.0
---
# Dataset Card for "live_ATC_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sled-umich/Conversation-Entailment | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- crowdsourced
license: []
multilinguality:
- monolingual
pretty_name: Conversation-Entailment
size_categories:
- n<1K
source_datasets:
- original
tags:
- conversational
- entailment
task_categories:
- conversational
- text-classification
task_ids: []
---
# Conversation-Entailment
Official dataset for [Towards Conversation Entailment: An Empirical Investigation](https://sled.eecs.umich.edu/publication/dblp-confemnlp-zhang-c-10/). *Chen Zhang, Joyce Chai*. EMNLP, 2010

## Overview
Textual entailment has mainly focused on inference from written monologue text. Recent years have also seen an increasing amount of conversational data, such as conversation scripts of meetings, call center records, court proceedings, and online chats. Although conversation is a form of language, it differs from monologue text in several important ways: turn-taking between participants, grounding between participants, different linguistic phenomena of utterances, and conversational implicatures. Traditional approaches to textual entailment were not designed to handle these unique conversation behaviors and thus cannot support automated entailment from conversation scripts. This project intends to address this limitation.
### Download
```python
from datasets import load_dataset
dataset = load_dataset("sled-umich/Conversation-Entailment")
```
* [HuggingFace-Dataset](https://huggingface.co/datasets/sled-umich/Conversation-Entailment)
* [DropBox](https://www.dropbox.com/s/z5vchgzvzxv75es/conversation_entailment.tar?dl=0)
### Data Sample
```json
{
"id": 3,
"type": "fact",
"dialog_num_list": [
30,
31
],
"dialog_speaker_list": [
"B",
"A"
],
"dialog_text_list": [
"Have you seen SLEEPING WITH THE ENEMY?",
"No. I've heard, I've heard that's really great, though."
],
"h": "SpeakerA and SpeakerB have seen SLEEPING WITH THE ENEMY",
"entailment": false,
"dialog_source": "SW2010"
}
```
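A minimal sketch of reading these fields back out, assuming a single `train` split and the field names shown in the sample above:
```python
from datasets import load_dataset

# Minimal sketch: print each dialog turn with its speaker, then the
# hypothesis and its gold entailment label (field names as in the sample).
dataset = load_dataset("sled-umich/Conversation-Entailment")
example = dataset["train"][0]
for speaker, text in zip(example["dialog_speaker_list"],
                         example["dialog_text_list"]):
    print(f"Speaker {speaker}: {text}")
print("Hypothesis:", example["h"])
print("Entails:", example["entailment"])
```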
### Cite
[Towards Conversation Entailment: An Empirical Investigation](https://sled.eecs.umich.edu/publication/dblp-confemnlp-zhang-c-10/). *Chen Zhang, Joyce Chai*. EMNLP, 2010. [[Paper]](https://aclanthology.org/D10-1074/)
```tex
@inproceedings{zhang-chai-2010-towards,
title = "Towards Conversation Entailment: An Empirical Investigation",
author = "Zhang, Chen and
Chai, Joyce",
booktitle = "Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing",
month = oct,
year = "2010",
address = "Cambridge, MA",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/D10-1074",
pages = "756--766",
}
``` |
CyberHarem/turner_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of turner (Fire Emblem)
This is the dataset of turner (Fire Emblem), containing 155 images and their tags.
The core tags of this character are `long_hair, blue_hair, braid, blue_eyes, ponytail, twin_braids, breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 155 | 182.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turner_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 155 | 116.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turner_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 350 | 234.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turner_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 155 | 167.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turner_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 350 | 310.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/turner_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/turner_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, cleavage, solo, one-piece_swimsuit, smile, covered_navel, open_mouth, looking_at_viewer, medium_breasts, collarbone, day, holding, towel, blue_sky, blush, large_breasts, bangs, cloud, outdoors, side_braids |
| 1 | 7 |  |  |  |  |  | 1girl, pauldrons, solo, looking_at_viewer, simple_background, smile, upper_body, side_braid, choker, closed_mouth, twitter_username |
| 2 | 24 |  |  |  |  |  | 1girl, gloves, solo, armor, thighhighs, smile, zettai_ryouiki, open_mouth, pegasus_knight_uniform_(fire_emblem), skirt, dress |
| 3 | 10 |  |  |  |  |  | 1girl, bangs, full_body, solo, white_gloves, simple_background, open_mouth, short_dress, shiny_hair, zettai_ryouiki, looking_away, medium_breasts, white_thighhighs, cape, cleavage, fingerless_gloves, holding_bow_(weapon), shoulder_armor, skirt, smile, white_footwear, arrow_(projectile), choker, grey_background, puffy_short_sleeves, ribbon, spear, thigh_boots, white_background |
| 4 | 18 |  |  |  |  |  | 1girl, hetero, blush, nipples, solo_focus, 1boy, open_mouth, sex, large_breasts, vaginal, penis, cum_in_pussy, spread_legs, tears, rape, breast_grab, grabbing, mosaic_censoring, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | solo | one-piece_swimsuit | smile | covered_navel | open_mouth | looking_at_viewer | medium_breasts | collarbone | day | holding | towel | blue_sky | blush | large_breasts | bangs | cloud | outdoors | side_braids | pauldrons | simple_background | upper_body | side_braid | choker | closed_mouth | twitter_username | gloves | armor | thighhighs | zettai_ryouiki | pegasus_knight_uniform_(fire_emblem) | skirt | dress | full_body | white_gloves | short_dress | shiny_hair | looking_away | white_thighhighs | cape | fingerless_gloves | holding_bow_(weapon) | shoulder_armor | white_footwear | arrow_(projectile) | grey_background | puffy_short_sleeves | ribbon | spear | thigh_boots | white_background | hetero | nipples | solo_focus | 1boy | sex | vaginal | penis | cum_in_pussy | spread_legs | tears | rape | breast_grab | grabbing | mosaic_censoring |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:---------------------|:--------|:----------------|:-------------|:--------------------|:-----------------|:-------------|:------|:----------|:--------|:-----------|:--------|:----------------|:--------|:--------|:-----------|:--------------|:------------|:--------------------|:-------------|:-------------|:---------|:---------------|:-------------------|:---------|:--------|:-------------|:-----------------|:---------------------------------------|:--------|:--------|:------------|:---------------|:--------------|:-------------|:---------------|:-------------------|:-------|:--------------------|:-----------------------|:-----------------|:-----------------|:---------------------|:------------------|:----------------------|:---------|:--------|:--------------|:-------------------|:---------|:----------|:-------------|:-------|:------|:----------|:--------|:---------------|:--------------|:--------|:-------|:--------------|:-----------|:-------------------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | X | | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 24 |  |  |  |  |  | X | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | X | | X | | X | | X | | | | | | | | X | | | | | X | | | X | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 4 | 18 |  |  |  |  |  | X | | | | | | X | | | | | | | | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
deetsadi/processed_dwi_sobel | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: conditioning_image
dtype: image
splits:
- name: train
num_bytes: 25891869.0
num_examples: 200
download_size: 25895367
dataset_size: 25891869.0
---
# Dataset Card for "processed_dwi_sobel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quocanh34/result_with_w2v2_baseline | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371487.625
num_examples: 1299
download_size: 164231228
dataset_size: 174371487.625
---
# Dataset Card for "result_with_w2v2_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KnutJaegersberg__MistralInstructLongish | ---
pretty_name: Evaluation run of KnutJaegersberg/MistralInstructLongish
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/MistralInstructLongish](https://huggingface.co/KnutJaegersberg/MistralInstructLongish)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__MistralInstructLongish_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-18T18:06:36.075482](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__MistralInstructLongish_public/blob/main/results_2023-11-18T18-06-36.075482.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5973103007065012,\n\
\ \"acc_stderr\": 0.03292863737262349,\n \"acc_norm\": 0.6085223268477976,\n\
\ \"acc_norm_stderr\": 0.033764939289429516,\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.015594753632006526,\n \"mc2\": 0.4055061003617047,\n\
\ \"mc2_stderr\": 0.014261205384601018,\n \"em\": 0.06910654362416108,\n\
\ \"em_stderr\": 0.0025974621402952,\n \"f1\": 0.21221371644295373,\n\
\ \"f1_stderr\": 0.00318177597759032\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5537542662116041,\n \"acc_stderr\": 0.014526705548539978,\n\
\ \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670717\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6246763592909779,\n\
\ \"acc_stderr\": 0.004832167854501645,\n \"acc_norm\": 0.8185620394343757,\n\
\ \"acc_norm_stderr\": 0.0038459301696437916\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.02513809138885111,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.02513809138885111\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\
\ \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.6967741935483871,\n\
\ \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.0356796977226805,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.0356796977226805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397436,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131137,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131137\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.031968769891957786,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.031968769891957786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7853211009174312,\n \"acc_stderr\": 0.017604304149256483,\n \"\
acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.017604304149256483\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159253,\n\
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159253\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7943805874840357,\n\
\ \"acc_stderr\": 0.01445250045678583,\n \"acc_norm\": 0.7943805874840357,\n\
\ \"acc_norm_stderr\": 0.01445250045678583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306386,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306386\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n\
\ \"acc_stderr\": 0.014635185616527822,\n \"acc_norm\": 0.2581005586592179,\n\
\ \"acc_norm_stderr\": 0.014635185616527822\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.02633661346904663,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.02633661346904663\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026229649178821163,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026229649178821163\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.408735332464146,\n\
\ \"acc_stderr\": 0.01255570134670338,\n \"acc_norm\": 0.408735332464146,\n\
\ \"acc_norm_stderr\": 0.01255570134670338\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6258169934640523,\n \"acc_stderr\": 0.019576953122088826,\n \
\ \"acc_norm\": 0.6258169934640523,\n \"acc_norm_stderr\": 0.019576953122088826\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108757,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108757\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.015594753632006526,\n \"mc2\": 0.4055061003617047,\n\
\ \"mc2_stderr\": 0.014261205384601018\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237986\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.06910654362416108,\n \
\ \"em_stderr\": 0.0025974621402952,\n \"f1\": 0.21221371644295373,\n \
\ \"f1_stderr\": 0.00318177597759032\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.015163002274450341,\n \"acc_stderr\": 0.00336602294972636\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/MistralInstructLongish
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|arc:challenge|25_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|drop|3_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|gsm8k|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hellaswag|10_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T18-06-36.075482.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-18T18-06-36.075482.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- '**/details_harness|winogrande|5_2023-11-18T18-06-36.075482.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-18T18-06-36.075482.parquet'
- config_name: results
data_files:
- split: 2023_11_18T18_06_36.075482
path:
- results_2023-11-18T18-06-36.075482.parquet
- split: latest
path:
- results_2023-11-18T18-06-36.075482.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/MistralInstructLongish
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/MistralInstructLongish
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/MistralInstructLongish](https://huggingface.co/KnutJaegersberg/MistralInstructLongish) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__MistralInstructLongish_public",
"harness_winogrande_5",
split="train")
```
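The aggregated scores can be fetched the same way. As a minimal companion sketch (using the `results` configuration and the `latest` split defined in the configs above, so no new names are assumed), you could load the summary numbers like this:
```python
from datasets import load_dataset

# The "results" configuration aggregates all scores for the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__MistralInstructLongish_public",
    "results",
    split="latest",
)
```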
## Latest results
These are the [latest results from run 2023-11-18T18:06:36.075482](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__MistralInstructLongish_public/blob/main/results_2023-11-18T18-06-36.075482.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5973103007065012,
"acc_stderr": 0.03292863737262349,
"acc_norm": 0.6085223268477976,
"acc_norm_stderr": 0.033764939289429516,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006526,
"mc2": 0.4055061003617047,
"mc2_stderr": 0.014261205384601018,
"em": 0.06910654362416108,
"em_stderr": 0.0025974621402952,
"f1": 0.21221371644295373,
"f1_stderr": 0.00318177597759032
},
"harness|arc:challenge|25": {
"acc": 0.5537542662116041,
"acc_stderr": 0.014526705548539978,
"acc_norm": 0.6075085324232082,
"acc_norm_stderr": 0.014269634635670717
},
"harness|hellaswag|10": {
"acc": 0.6246763592909779,
"acc_stderr": 0.004832167854501645,
"acc_norm": 0.8185620394343757,
"acc_norm_stderr": 0.0038459301696437916
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.02513809138885111,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.02513809138885111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.0356796977226805,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.0356796977226805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397436,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130956,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130956
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131137,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131137
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.017604304149256483,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.017604304149256483
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159253,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159253
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7943805874840357,
"acc_stderr": 0.01445250045678583,
"acc_norm": 0.7943805874840357,
"acc_norm_stderr": 0.01445250045678583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306386,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306386
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.014635185616527822,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.014635185616527822
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.02633661346904663,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.02633661346904663
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026229649178821163,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026229649178821163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.408735332464146,
"acc_stderr": 0.01255570134670338,
"acc_norm": 0.408735332464146,
"acc_norm_stderr": 0.01255570134670338
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6258169934640523,
"acc_stderr": 0.019576953122088826,
"acc_norm": 0.6258169934640523,
"acc_norm_stderr": 0.019576953122088826
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108757,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108757
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006526,
"mc2": 0.4055061003617047,
"mc2_stderr": 0.014261205384601018
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237986
},
"harness|drop|3": {
"em": 0.06910654362416108,
"em_stderr": 0.0025974621402952,
"f1": 0.21221371644295373,
"f1_stderr": 0.00318177597759032
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.00336602294972636
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
FINNUMBER/FINCH_TRAIN_SA_ESG_400 | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: 'null'
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3546289
num_examples: 400
download_size: 1933232
dataset_size: 3546289
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MicPie/unpredictable_cluster23 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-cluster23
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-cluster23" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. The shape of our dataset is very wide, i.e., we have thousands of tasks, while each task has only a few examples, compared to most current NLP datasets, which are very deep, i.e., tens of tasks with many examples. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a jsonline file and consists of several few-shot examples. Each example is a dictionary containing a field 'task', which identifies the task, followed by an 'input', 'options', and 'output' field. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target which represents an individual column of the same row. Each task contains several such examples which can be concatenated as a few-shot task. In the case of multiple choice classification, the 'options' field contains the possible classes that a model needs to choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
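As a minimal illustration of this structure (a sketch, not part of the official card: it assumes the single split is named `train` and that the fields listed under Data Fields below are plain strings), one could assemble a few-shot prompt from the examples of a single task like this:
```python
from datasets import load_dataset

# Minimal sketch: turn one task's examples into a few-shot prompt.
# Assumes the single split is named "train" and that 'input' and
# 'output' are plain strings (see the field descriptions below).
ds = load_dataset("MicPie/unpredictable_cluster23", split="train")

first_task = ds[0]["task"]
examples = [ex for ex in ds if ex["task"] == first_task][:4]

# Concatenate input/output pairs; for multiple-choice tasks the
# 'options' field lists the candidate classes to choose from.
prompt = "\n\n".join(f"{ex['input']}\n{ex['output']}" for ex in examples)
print(prompt)
```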
### Data Fields
* 'task': task identifier
* 'input': column elements of a specific row in the table.
* 'options': for multiple choice classification, it provides the options to choose from.
* 'output': target column element of the same row as input.
* 'pageTitle': the title of the page containing the table.
* 'outputColName': output column name
* 'url': url to the website containing the table
* 'wdcFile': WDC Web Table Corpus file
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low), [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
open-llm-leaderboard/details_cgato__Thespis-Krangled-7b-v2 | ---
pretty_name: Evaluation run of cgato/Thespis-Krangled-7b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cgato/Thespis-Krangled-7b-v2](https://huggingface.co/cgato/Thespis-Krangled-7b-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run\
\ can be found as a specific split in each configuration, the split being named\
\ using the timestamp of the run. The \"train\" split is always pointing to the\
\ latest results.\n\nAn additional configuration \"results\" stores all the aggregated\
\ results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cgato__Thespis-Krangled-7b-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-15T02:30:42.449259](https://huggingface.co/datasets/open-llm-leaderboard/details_cgato__Thespis-Krangled-7b-v2/blob/main/results_2024-03-15T02-30-42.449259.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.623157694126533,\n\
\ \"acc_stderr\": 0.032761952127641665,\n \"acc_norm\": 0.6279941253012823,\n\
\ \"acc_norm_stderr\": 0.03342320585396732,\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5302234821166525,\n\
\ \"mc2_stderr\": 0.015157977440178593\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5972696245733788,\n \"acc_stderr\": 0.014332236306790147,\n\
\ \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.01411797190114282\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6307508464449313,\n\
\ \"acc_stderr\": 0.004816152074023085,\n \"acc_norm\": 0.8304122684724159,\n\
\ \"acc_norm_stderr\": 0.003745032667228281\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n\
\ \"acc_stderr\": 0.025649381063029265,\n \"acc_norm\": 0.7161290322580646,\n\
\ \"acc_norm_stderr\": 0.025649381063029265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.024635549163908237,\n \
\ \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908237\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977747,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977747\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n\
\ \"acc_stderr\": 0.01520103251252044,\n \"acc_norm\": 0.2916201117318436,\n\
\ \"acc_norm_stderr\": 0.01520103251252044\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984806,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984806\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.012687818419599919,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.012687818419599919\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553697,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553697\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5302234821166525,\n\
\ \"mc2_stderr\": 0.015157977440178593\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.01166122363764341\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4131918119787718,\n \
\ \"acc_stderr\": 0.013563326951984367\n }\n}\n```"
repo_url: https://huggingface.co/cgato/Thespis-Krangled-7b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|arc:challenge|25_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|gsm8k|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hellaswag|10_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T02-30-42.449259.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T02-30-42.449259.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- '**/details_harness|winogrande|5_2024-03-15T02-30-42.449259.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-15T02-30-42.449259.parquet'
- config_name: results
data_files:
- split: 2024_03_15T02_30_42.449259
path:
- results_2024-03-15T02-30-42.449259.parquet
- split: latest
path:
- results_2024-03-15T02-30-42.449259.parquet
---
# Dataset Card for Evaluation run of cgato/Thespis-Krangled-7b-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cgato/Thespis-Krangled-7b-v2](https://huggingface.co/cgato/Thespis-Krangled-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cgato__Thespis-Krangled-7b-v2",
"harness_winogrande_5",
split="train")
```
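The aggregated metrics can be loaded in the same way from the "results" configuration (a minimal sketch following the config and split names declared in the YAML header above):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "latest" split
# always points to the newest results parquet file.
results = load_dataset(
	"open-llm-leaderboard/details_cgato__Thespis-Krangled-7b-v2",
	"results",
	split="latest")
```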
## Latest results
These are the [latest results from run 2024-03-15T02:30:42.449259](https://huggingface.co/datasets/open-llm-leaderboard/details_cgato__Thespis-Krangled-7b-v2/blob/main/results_2024-03-15T02-30-42.449259.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" config and the "latest" split of each eval):
```json
{
"all": {
"acc": 0.623157694126533,
"acc_stderr": 0.032761952127641665,
"acc_norm": 0.6279941253012823,
"acc_norm_stderr": 0.03342320585396732,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5302234821166525,
"mc2_stderr": 0.015157977440178593
},
"harness|arc:challenge|25": {
"acc": 0.5972696245733788,
"acc_stderr": 0.014332236306790147,
"acc_norm": 0.628839590443686,
"acc_norm_stderr": 0.01411797190114282
},
"harness|hellaswag|10": {
"acc": 0.6307508464449313,
"acc_stderr": 0.004816152074023085,
"acc_norm": 0.8304122684724159,
"acc_norm_stderr": 0.003745032667228281
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.025649381063029265,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.025649381063029265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.024635549163908237,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.024635549163908237
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977747,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977747
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.01520103251252044,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.01520103251252044
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984806,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984806
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599919,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599919
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553697,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553697
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5302234821166525,
"mc2_stderr": 0.015157977440178593
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.01166122363764341
},
"harness|gsm8k|5": {
"acc": 0.4131918119787718,
"acc_stderr": 0.013563326951984367
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
YutoNishimura-v2/text-to-kanji-v3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 30215498.53
num_examples: 6410
download_size: 33222785
dataset_size: 30215498.53
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gabrielmbmb/ultrafeedback-prompts-judgelm-gpt35 | ---
dataset_info:
features:
- name: input
dtype: string
- name: generation_model
dtype: string
- name: generation_prompt
dtype: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
- name: labelling_model
dtype: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
dtype: string
- name: ratings
sequence: int64
- name: rationale
dtype: string
splits:
- name: train
num_bytes: 13479349
num_examples: 1000
download_size: 6250632
dataset_size: 13479349
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ultrafeedback-prompts-judgelm-gpt35"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ssuengpp/test2_0226 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8609964
num_examples: 6374
download_size: 5007145
dataset_size: 8609964
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DanielDimas/123 | ---
license: openrail
---
|
bofenghuang/mt-bench-french | ---
license: apache-2.0
task_categories:
- question-answering
language:
- fr
tags:
- evaluation
pretty_name: MT-Bench-French
size_categories:
- n<1K
configs:
- config_name: default
data_files:
- split: test
path: "question.jsonl"
---
# MT-Bench-French
This is a French version of [MT-Bench](https://arxiv.org/abs/2306.05685), created to evaluate the multi-turn conversation and instruction-following capabilities of LLMs. Similar to its original version, MT-Bench-French comprises 80 high-quality, multi-turn questions spanning eight main categories.
All questions have undergone translation into French and **thorough human review** to guarantee the use of suitable and authentic wording, meaningful content for assessing LLMs' capabilities in the French language, and coherence between questions within the same conversation.
For certain challenging tasks (e.g., math, reasoning, and coding), a reference answer is included in the judge prompt to assist in evaluating responses from LLMs, an approach referred to as a *reference-guided judge*. Notably, these reference answers are also generated by the LLM judge (GPT-4). In our version, we took the extra step of having humans review and correct these reference answers. This was done to address several concerns: 1) GPT-4 exhibits a decline in performance when transitioning from English to French, and its responses to complex tasks did not meet the standard required of reference answers. 2) Human-corrected reference answers help mitigate bias in evaluating LLMs, although it's important to note that some degree of bias still persists.
*Please note that although this dataset provides a convenient way to evaluate LLMs, it shouldn't be regarded as the ultimate benchmark for such assessments, given the inherent limitations of both the dataset and the methodology.*
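For reference, the questions can be loaded with the `datasets` library (a minimal sketch; per the YAML header above, the default config maps its `test` split to `question.jsonl`):
```python
from datasets import load_dataset

# Load the 80 multi-turn French questions from question.jsonl
questions = load_dataset("bofenghuang/mt-bench-french", split="test")
print(len(questions))  # expected: 80 questions across 8 categories
```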
## News
- [2024/03/14]: Added `claude-3-haiku-20240307`, `claude-3-sonnet-20240229`, `claude-3-opus-20240229`, and `c4ai-command-r-v01`
- [2024/02/26]: Added `mistral-small-2402`, `mistral-large-2402`, and `gpt-4-0125-preview`
- [2024/01/26]: Added `mistral-small-2312` with thanks to @thomlevy
## Evaluation
*Last updated on Mar 14, 2024*
```
########## First turn ##########
score
model turn
gpt-4-0125-preview 1 9.350000
gpt-4-1106-preview 1 9.343750
claude-3-opus-20240229 1 9.056250
mistral-large-2402 1 9.006250
gpt-4-0314 1 8.987500
mistral-small-2402 1 8.493750
claude-3-sonnet-20240229 1 8.462500
mistral-medium-2312 1 8.412500
gpt-3.5-turbo-0613 1 8.387500
claude-3-haiku-20240307 1 8.237500
mistral-small-2312 1 8.156250
c4ai-command-r-v01 1 7.431250
vigogne-2-70b-chat 1 7.381250
gemini-pro 1 7.194805
########## Second turn ##########
score
model turn
gpt-4-0125-preview 2 9.050000
gpt-4-1106-preview 2 9.050000
claude-3-opus-20240229 2 8.812500
gpt-4-0314 2 8.656250
mistral-large-2402 2 8.437500
claude-3-sonnet-20240229 2 8.137500
mistral-medium-2312 2 8.037500
mistral-small-2402 2 8.025000
claude-3-haiku-20240307 2 7.812500
gpt-3.5-turbo-0613 2 7.612500
mistral-small-2312 2 7.562500
gemini-pro 2 7.545455
c4ai-command-r-v01 2 7.143750
vigogne-2-70b-chat 2 7.075000
########## Average ##########
score
model
gpt-4-0125-preview 9.200000
gpt-4-1106-preview 9.196875
claude-3-opus-20240229 8.934375
gpt-4-0314 8.821875
mistral-large-2402 8.721875
claude-3-sonnet-20240229 8.300000
mistral-small-2402 8.259375
mistral-medium-2312 8.225000
claude-3-haiku-20240307 8.025000
gpt-3.5-turbo-0613 8.000000
mistral-small-2312 7.859375
gemini-pro 7.370130
c4ai-command-r-v01 7.287500
vigogne-2-70b-chat 7.228125
```
## Examples
Here are a few examples to highlight the distinction between a literal translation and our curated version:
#### Choosing appropriate and authentic wording
*Original question:*
```
Given the following data, identify the company with the highest profit in 2021 and provide its CEO's name:
...
Which company had the highest profit margin (profit/revenue ratio)?
```
*Translated question:*
```
Étant donné les informations suivantes, identifie le nom de l'entreprise qui a réalisé le plus gros bénéfice en 2021 et fournis le nom de son PDG :
...
Quelle entreprise avait la marge bénéficiaire la plus élevée (rapport bénéfice/chiffre d'affaires) ?
```
Certain translators translate "profit/revenue ratio" as "rapport bénéfice/revenu", but the accurate translation should be "rapport bénéfice/chiffre d'affaires".
#### Following original question format
*Original question:*
```
Can you change the ratings from numbers to letters? Capital letters MUST be used when writing the names of phones.
```
*Translated question:*
```
Pouvez-vous changer les notes de chiffres en lettres ? Les noms des téléphones doivent être écrits IMPÉRATIVEMENT en lettres majuscules.
```
We maintain the original question's format, highlighting "MUST" in uppercase ("IMPÉRATIVEMENT" in French) to grab the attention of the language model. Additionally, we uphold other formats, including indentation and line breaks, in the translated version.
#### Avoiding unnecessary translation of Anglicisms
*Original question:*
```
A tech startup invests $8000 in software development in the first year...
```
*Translated question:*
```
Une startup technologique investit 8000 euros dans le développement de logiciels la première année...
```
Some English terms were kept as-is, as they are commonly used in French.
#### Mixing formal and informal pronouns for diversity
*Translated question 1:*
```
Veuillez assumer le rôle d'un coach relationnel. Vous recevrez des détails sur deux personnes en conflit, et votre tâche sera de proposer des suggestions pour résoudre leurs problèmes et combler le fossé entre eux.
```
*Translated question 2:*
```
Crée un plan de leçon intégrant des techniques de théâtre
```
*Translated question 3:*
```
Est-ce que tu aimes danser ? Peux-tu m'apprendre ?
```
#### Ensuring meaningfulness in the translated questions
*Original question:*
```
Edit the following paragraph to correct any grammatical errors:
She didn't remembre where is her purse, so I thinks its in the car but he's say it's on kitchen table but he are not sure, and then they asked me to looking for it, she's say, "Can you?", and I responds with, "Maybe, but ain't no sure," and he not heard me, and, "What?", he asks, "Did you found it?".
```
*Translated question:*
```
Editez le paragraphe suivant pour corriger toute erreur grammaticale :
Elle ne se souvenaeint pas où été son sac à main, donc je penses qu'il est dans le voiture, mais il à dis qu'il est sur table du cuisine, bien qu'il n'en soient pas sûre. Ensuite, ils m'ont demandé de le cherchez. "Tu peut ?", elle a demandée, et j'ai répond, "Peut être, mais ne suis pas sûr." Il ne ma entendu, et il a demander "Quoi ? Tu l'a trouvés ?"
```
Some translators might rectify grammatical errors in the sentence. In contrast, we translated it and purposely introduced certain common errors in French.
#### Guaranteeing the translated questions are suitable for evaluating LLMs in French
*Original question:*
```
Please assume the role of an English translator, tasked with correcting and enhancing spelling and language. Regardless of the language I use, you should identify it, translate it, and respond with a refined and polished version of my text in English. Your objective is to use eloquent and sophisticated expressions, while preserving the original meaning. Focus solely on providing corrections and improvements. My first request is "衣带渐宽终不悔 为伊消得人憔悴".
```
*Translated question:*
```
Joue le rôle d'un traducteur francophone que l'on a chargé de corriger et d'embellir l'orthographe et l'expression de mon travail. Indépendamment de la langue utilisée, identifie-la, traduis-la et produis une version française plus raffinée de mon texte. Ton but est d'employer des expressions éloquentes et sophistiquées tout en préservant le sens original. Contente-toi de fournir des corrections et des améliorations. Ma première requête est la suivante : "衣带渐宽终不悔 为伊消得人憔悴".
```
Given that we are evaluating LLMs for the French language, we request the model to translate a sentence into French instead of English.
#### Miscellaneous
*Original question:*
```
"Compose an engaging travel blog post about a recent trip to Hawaii, highlighting cultural experiences and must-see attractions.
```
*Translated question:*
```
Rédigez un blog de voyage captivant sur un voyage récent en Corse, en mettant en évidence les expériences culturelles et les attractions incontournables.
```
We replaced the destination 'Hawaii' with 'Corse' since it is more aligned with French culture, along with other changes like substituting "dollar" with "euro".
## How to evaluate custom models
Please refer to the [instructions](https://github.com/lm-sys/FastChat/tree/main/fastchat/llm_judge#mt-bench) of LMSYS for guidance on evaluating custom models.
## Limitations
This dataset serves the purpose of efficiently evaluating the performance of LLMs in the French language. However, it's important to acknowledge its limitations, which include:
- GPT-4's inherent bias in assessing LLM responses, particularly in aspects like position, verbosity, and self-enhancement. Additionally, it exhibits limitations in effectively grading challenging tasks, such as those requiring advanced mathematical and reasoning capabilities.
- The dataset encompasses only 10 tasks (20 questions) per category, which might not provide a comprehensive representation of the full capabilities of LLMs.
- A majority of the questions were translated from their original English version, potentially impacting the dataset's ability to accurately reflect performance in the context of French culture.
## Acknowledgment
- [LMSYS](https://lmsys.org) for creating the original dataset
- [Audrey Cornu](https://www.linkedin.com/in/audrey-cornu-0b9808142), [Tiphaine Fievet](https://www.linkedin.com/in/tiphaine-fievet-84b3431b8), [Amira Guesmi](https://www.linkedin.com/in/amira-guesmi-4a435684), [Cindy Perrigault](https://www.linkedin.com/in/cindy-perrigault), [Hayoung Seo](https://www.linkedin.com/in/hayoung-seo-180b26200) (in alphabetical order), and myself for the translation and careful review
|
hoangphu7122002ai/instruct_general_coreference_resolution | ---
dataset_info:
- config_name: 0-10000
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: gen
dtype: string
- name: len
dtype: int64
- name: SCRIPT
dtype: string
splits:
- name: train
num_bytes: 87826965
num_examples: 10000
download_size: 45185857
dataset_size: 87826965
- config_name: 10000-20000
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: gen
dtype: string
- name: len
dtype: int64
- name: SCRIPT
dtype: string
splits:
- name: train
num_bytes: 88294476
num_examples: 10000
download_size: 45448731
dataset_size: 88294476
- config_name: 20000-30000
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: gen
dtype: string
- name: len
dtype: int64
- name: SCRIPT
dtype: string
splits:
- name: train
num_bytes: 88179711
num_examples: 10000
download_size: 45425460
dataset_size: 88179711
- config_name: 30000-40000
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: gen
dtype: string
- name: len
dtype: int64
- name: SCRIPT
dtype: string
splits:
- name: train
num_bytes: 88187444
num_examples: 10000
download_size: 45344066
dataset_size: 88187444
- config_name: 40000-50000
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: gen
dtype: string
- name: len
dtype: int64
- name: SCRIPT
dtype: string
splits:
- name: train
num_bytes: 88314094
num_examples: 10000
download_size: 45405726
dataset_size: 88314094
- config_name: 50000-60000
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: gen
dtype: string
- name: len
dtype: int64
- name: SCRIPT
dtype: string
splits:
- name: train
num_bytes: 87439746
num_examples: 10000
download_size: 44864323
dataset_size: 87439746
- config_name: default
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: gen
dtype: string
- name: len
dtype: int64
- name: SCRIPT
dtype: string
splits:
- name: train
num_bytes: 39833704
num_examples: 10000
download_size: 21237984
dataset_size: 39833704
configs:
- config_name: 0-10000
data_files:
- split: train
path: data/0-10000/train-*
- config_name: 10000-20000
data_files:
- split: train
path: data/10000-20000/train-*
- config_name: 20000-30000
data_files:
- split: train
path: data/20000-30000/train-*
- config_name: 30000-40000
data_files:
- split: train
path: data/30000-40000/train-*
- config_name: 40000-50000
data_files:
- split: train
path: data/40000-50000/train-*
- config_name: 50000-60000
data_files:
- split: train
path: data/50000-60000/train-*
- config_name: default
data_files:
- split: train
path: data/50000-60000/train-*
---
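A minimal loading sketch for one 10,000-example shard, assuming the `datasets` library (config names as declared in the front matter above):

```python
from datasets import load_dataset

# Each named config ("0-10000", "10000-20000", ...) is a 10k-example shard.
shard = load_dataset(
    "hoangphu7122002ai/instruct_general_coreference_resolution",
    "0-10000",
    split="train",
)
print(shard[0]["id"], shard[0]["title"])
```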
|
paul-w-qs/contracts_v5 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: N_ROWS
dtype: int64
- name: N_COLS
dtype: int64
- name: FONT_SIZE
dtype: int64
- name: FONT_NAME
dtype: string
- name: BORDER_THICKNESS
dtype: int64
- name: TABLE_STYLE
dtype: string
- name: NOISED
dtype: bool
- name: LABEL_NOISE
dtype: bool
- name: JSON_LABEL
dtype: string
splits:
- name: train
num_bytes: 849678199.916
num_examples: 11316
download_size: 819599314
dataset_size: 849678199.916
---
# Dataset Card for "contracts_v5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JonasLange/DBD-research-group | ---
dataset_info:
features:
- name: ID
dtype: string
- name: Audio
dtype: audio
- name: Start Time (s)
dtype: string
- name: End Time (s)
dtype: string
- name: Low Freq (Hz)
dtype: string
- name: High Freq (Hz)
dtype: string
- name: Species eBird Code
dtype: string
- name: Call Type
dtype: 'null'
- name: Sex
dtype: 'null'
- name: Latitude
dtype: float64
- name: Longitude
dtype: float64
- name: Uncertainty
dtype: 'null'
- name: Microphone
dtype: string
- name: License
dtype: string
- name: Source
dtype: string
- name: BirdNet Training Data
dtype: bool
splits:
- name: train
num_bytes: 156231869250.216
num_examples: 10976
download_size: 538872707
dataset_size: 156231869250.216
---
# Dataset Card for "DBD-research-group"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marmofayezi/M3GenMaskPrompt | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
- name: caption
dtype: string
- name: generated_image
dtype: image
splits:
- name: train
num_bytes: 2255985556.75
num_examples: 2998
download_size: 1887856205
dataset_size: 2255985556.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
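A minimal loading sketch, assuming the `datasets` library (feature names as declared above):

```python
from datasets import load_dataset

ds = load_dataset("marmofayezi/M3GenMaskPrompt", split="train")
sample = ds[0]
# image, mask, and generated_image decode to PIL images; caption is a string.
print(sample["caption"], sample["image"].size)
```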
|
open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k | ---
pretty_name: Evaluation run of mncai/Mistral-7B-openplatypus-1k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mncai/Mistral-7B-openplatypus-1k](https://huggingface.co/mncai/Mistral-7B-openplatypus-1k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T04:31:44.728538](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k/blob/main/results_2023-10-27T04-31-44.728538.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.00045666764626669425,\n \"f1\": 0.06536912751677865,\n\
\ \"f1_stderr\": 0.001427220169024926,\n \"acc\": 0.47155979662189373,\n\
\ \"acc_stderr\": 0.01115073074341337\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626669425,\n\
\ \"f1\": 0.06536912751677865,\n \"f1_stderr\": 0.001427220169024926\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17437452615617893,\n \
\ \"acc_stderr\": 0.010451421361976233\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.01185004012485051\n\
\ }\n}\n```"
repo_url: https://huggingface.co/mncai/Mistral-7B-openplatypus-1k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T04_31_44.728538
path:
- '**/details_harness|drop|3_2023-10-27T04-31-44.728538.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T04-31-44.728538.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T04_31_44.728538
path:
- '**/details_harness|gsm8k|5_2023-10-27T04-31-44.728538.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T04-31-44.728538.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T04_31_44.728538
path:
- '**/details_harness|winogrande|5_2023-10-27T04-31-44.728538.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T04-31-44.728538.parquet'
- config_name: results
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- results_2023-10-10T11-26-36.133476.parquet
- split: 2023_10_27T04_31_44.728538
path:
- results_2023-10-27T04-31-44.728538.parquet
- split: latest
path:
- results_2023-10-27T04-31-44.728538.parquet
---
# Dataset Card for Evaluation run of mncai/Mistral-7B-openplatypus-1k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mncai/Mistral-7B-openplatypus-1k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mncai/Mistral-7B-openplatypus-1k](https://huggingface.co/mncai/Mistral-7B-openplatypus-1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k",
"harness_winogrande_5",
split="train")
```
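To enumerate the other configurations before picking one, here is a small sketch using the `datasets` helper (the list mirrors the configs declared in the front matter):

```python
from datasets import get_dataset_config_names

# 64 configurations: one per evaluated task plus the aggregated "results".
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k"
)
print(len(configs), configs[:5])
```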
## Latest results
These are the [latest results from run 2023-10-27T04:31:44.728538](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k/blob/main/results_2023-10-27T04-31-44.728538.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626669425,
"f1": 0.06536912751677865,
"f1_stderr": 0.001427220169024926,
"acc": 0.47155979662189373,
"acc_stderr": 0.01115073074341337
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626669425,
"f1": 0.06536912751677865,
"f1_stderr": 0.001427220169024926
},
"harness|gsm8k|5": {
"acc": 0.17437452615617893,
"acc_stderr": 0.010451421361976233
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.01185004012485051
}
}
```
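The same aggregated numbers can be pulled programmatically from the `results` configuration (a sketch; split names follow the run timestamps listed in the front matter):

```python
from datasets import load_dataset

# "latest" always resolves to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k",
    "results",
    split="latest",
)
print(results[0])
```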
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ebony59/AO3_fandom_chatbot | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
splits:
- name: train
num_bytes: 2269557
num_examples: 1036
download_size: 1161469
dataset_size: 2269557
---
# Dataset Card for "AO3_fandom_chatbot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rosenberg/CMeIE | ---
license: mit
---
|
open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-base | ---
pretty_name: Evaluation run of h2oai/h2o-danube2-1.8b-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2o-danube2-1.8b-base](https://huggingface.co/h2oai/h2o-danube2-1.8b-base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T14:43:45.005575](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-base/blob/main/results_2024-04-05T14-43-45.005575.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.407096217325087,\n\
\ \"acc_stderr\": 0.03416458904719109,\n \"acc_norm\": 0.4080909002245428,\n\
\ \"acc_norm_stderr\": 0.03488085494267354,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156496,\n \"mc2\": 0.3801259785228798,\n\
\ \"mc2_stderr\": 0.014088609967255201\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3984641638225256,\n \"acc_stderr\": 0.014306946052735562,\n\
\ \"acc_norm\": 0.4334470989761092,\n \"acc_norm_stderr\": 0.014481376224558896\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5417247560246963,\n\
\ \"acc_stderr\": 0.004972377085916327,\n \"acc_norm\": 0.7295359490141406,\n\
\ \"acc_norm_stderr\": 0.004432917403755053\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3618421052631579,\n \"acc_stderr\": 0.03910525752849724,\n\
\ \"acc_norm\": 0.3618421052631579,\n \"acc_norm_stderr\": 0.03910525752849724\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4188679245283019,\n \"acc_stderr\": 0.030365050829115205,\n\
\ \"acc_norm\": 0.4188679245283019,\n \"acc_norm_stderr\": 0.030365050829115205\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.3872832369942196,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.35319148936170214,\n \"acc_stderr\": 0.031245325202761926,\n\
\ \"acc_norm\": 0.35319148936170214,\n \"acc_norm_stderr\": 0.031245325202761926\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523864,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.38387096774193546,\n\
\ \"acc_stderr\": 0.027666182075539635,\n \"acc_norm\": 0.38387096774193546,\n\
\ \"acc_norm_stderr\": 0.027666182075539635\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.47878787878787876,\n \"acc_stderr\": 0.03900828913737302,\n\
\ \"acc_norm\": 0.47878787878787876,\n \"acc_norm_stderr\": 0.03900828913737302\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5050505050505051,\n \"acc_stderr\": 0.035621707606254015,\n \"\
acc_norm\": 0.5050505050505051,\n \"acc_norm_stderr\": 0.035621707606254015\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5233160621761658,\n \"acc_stderr\": 0.03604513672442202,\n\
\ \"acc_norm\": 0.5233160621761658,\n \"acc_norm_stderr\": 0.03604513672442202\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.38461538461538464,\n \"acc_stderr\": 0.024666744915187222,\n\
\ \"acc_norm\": 0.38461538461538464,\n \"acc_norm_stderr\": 0.024666744915187222\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \
\ \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567977,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567977\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.46605504587155966,\n \"acc_stderr\": 0.02138786335035399,\n \"\
acc_norm\": 0.46605504587155966,\n \"acc_norm_stderr\": 0.02138786335035399\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.030998666304560524,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.030998666304560524\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.45098039215686275,\n \"acc_stderr\": 0.03492406104163613,\n \"\
acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.03492406104163613\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5611814345991561,\n \"acc_stderr\": 0.032302649315470375,\n \
\ \"acc_norm\": 0.5611814345991561,\n \"acc_norm_stderr\": 0.032302649315470375\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5022421524663677,\n\
\ \"acc_stderr\": 0.03355746535223263,\n \"acc_norm\": 0.5022421524663677,\n\
\ \"acc_norm_stderr\": 0.03355746535223263\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.04385162325601553,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.04385162325601553\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4049079754601227,\n \"acc_stderr\": 0.03856672163548913,\n\
\ \"acc_norm\": 0.4049079754601227,\n \"acc_norm_stderr\": 0.03856672163548913\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.04950504382128921,\n\
\ \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.04950504382128921\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.03193705726200293,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.03193705726200293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5772669220945083,\n\
\ \"acc_stderr\": 0.017665180351954062,\n \"acc_norm\": 0.5772669220945083,\n\
\ \"acc_norm_stderr\": 0.017665180351954062\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4479768786127168,\n \"acc_stderr\": 0.02677299065336182,\n\
\ \"acc_norm\": 0.4479768786127168,\n \"acc_norm_stderr\": 0.02677299065336182\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095277,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095277\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.028491993586171566,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.028491993586171566\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4533762057877814,\n\
\ \"acc_stderr\": 0.02827435985489425,\n \"acc_norm\": 0.4533762057877814,\n\
\ \"acc_norm_stderr\": 0.02827435985489425\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.44753086419753085,\n \"acc_stderr\": 0.027667138569422697,\n\
\ \"acc_norm\": 0.44753086419753085,\n \"acc_norm_stderr\": 0.027667138569422697\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3049645390070922,\n \"acc_stderr\": 0.02746470844202213,\n \
\ \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.02746470844202213\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3396349413298566,\n\
\ \"acc_stderr\": 0.012095592506931974,\n \"acc_norm\": 0.3396349413298566,\n\
\ \"acc_norm_stderr\": 0.012095592506931974\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.27205882352941174,\n \"acc_stderr\": 0.027033041151681456,\n\
\ \"acc_norm\": 0.27205882352941174,\n \"acc_norm_stderr\": 0.027033041151681456\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.39869281045751637,\n \"acc_stderr\": 0.01980828131744985,\n \
\ \"acc_norm\": 0.39869281045751637,\n \"acc_norm_stderr\": 0.01980828131744985\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.4636363636363636,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.30612244897959184,\n \"acc_stderr\": 0.029504896454595957,\n\
\ \"acc_norm\": 0.30612244897959184,\n \"acc_norm_stderr\": 0.029504896454595957\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4925373134328358,\n\
\ \"acc_stderr\": 0.03535140084276719,\n \"acc_norm\": 0.4925373134328358,\n\
\ \"acc_norm_stderr\": 0.03535140084276719\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.03789134424611549,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.03789134424611549\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5321637426900585,\n \"acc_stderr\": 0.038268824176603704,\n\
\ \"acc_norm\": 0.5321637426900585,\n \"acc_norm_stderr\": 0.038268824176603704\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.015250117079156496,\n \"mc2\": 0.3801259785228798,\n\
\ \"mc2_stderr\": 0.014088609967255201\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6803472770323599,\n \"acc_stderr\": 0.013106528517665137\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2979529946929492,\n \
\ \"acc_stderr\": 0.012597932232914513\n }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2o-danube2-1.8b-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|arc:challenge|25_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|gsm8k|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hellaswag|10_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-43-45.005575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T14-43-45.005575.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- '**/details_harness|winogrande|5_2024-04-05T14-43-45.005575.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T14-43-45.005575.parquet'
- config_name: results
data_files:
- split: 2024_04_05T14_43_45.005575
path:
- results_2024-04-05T14-43-45.005575.parquet
- split: latest
path:
- results_2024-04-05T14-43-45.005575.parquet
---
# Dataset Card for Evaluation run of h2oai/h2o-danube2-1.8b-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [h2oai/h2o-danube2-1.8b-base](https://huggingface.co/h2oai/h2o-danube2-1.8b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-base",
"harness_winogrande_5",
split="train")
```
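If you want the aggregated metrics rather than per-task details, the "results" configuration declared in this card's metadata can be loaded the same way (a minimal sketch based on the configs above):
```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" always resolves to the most
# recent timestamped split (see the configs section of this card).
results = load_dataset(
    "open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-base",
    "results",
    split="latest",
)
print(results[0])
```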
## Latest results
These are the [latest results from run 2024-04-05T14:43:45.005575](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-base/blob/main/results_2024-04-05T14-43-45.005575.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.407096217325087,
"acc_stderr": 0.03416458904719109,
"acc_norm": 0.4080909002245428,
"acc_norm_stderr": 0.03488085494267354,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156496,
"mc2": 0.3801259785228798,
"mc2_stderr": 0.014088609967255201
},
"harness|arc:challenge|25": {
"acc": 0.3984641638225256,
"acc_stderr": 0.014306946052735562,
"acc_norm": 0.4334470989761092,
"acc_norm_stderr": 0.014481376224558896
},
"harness|hellaswag|10": {
"acc": 0.5417247560246963,
"acc_stderr": 0.004972377085916327,
"acc_norm": 0.7295359490141406,
"acc_norm_stderr": 0.004432917403755053
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3618421052631579,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.3618421052631579,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4188679245283019,
"acc_stderr": 0.030365050829115205,
"acc_norm": 0.4188679245283019,
"acc_norm_stderr": 0.030365050829115205
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179962,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179962
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.35319148936170214,
"acc_stderr": 0.031245325202761926,
"acc_norm": 0.35319148936170214,
"acc_norm_stderr": 0.031245325202761926
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.38387096774193546,
"acc_stderr": 0.027666182075539635,
"acc_norm": 0.38387096774193546,
"acc_norm_stderr": 0.027666182075539635
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.47878787878787876,
"acc_stderr": 0.03900828913737302,
"acc_norm": 0.47878787878787876,
"acc_norm_stderr": 0.03900828913737302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5050505050505051,
"acc_stderr": 0.035621707606254015,
"acc_norm": 0.5050505050505051,
"acc_norm_stderr": 0.035621707606254015
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5233160621761658,
"acc_stderr": 0.03604513672442202,
"acc_norm": 0.5233160621761658,
"acc_norm_stderr": 0.03604513672442202
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38461538461538464,
"acc_stderr": 0.024666744915187222,
"acc_norm": 0.38461538461538464,
"acc_norm_stderr": 0.024666744915187222
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567977,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567977
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.46605504587155966,
"acc_stderr": 0.02138786335035399,
"acc_norm": 0.46605504587155966,
"acc_norm_stderr": 0.02138786335035399
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.030998666304560524,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.030998666304560524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5611814345991561,
"acc_stderr": 0.032302649315470375,
"acc_norm": 0.5611814345991561,
"acc_norm_stderr": 0.032302649315470375
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5022421524663677,
"acc_stderr": 0.03355746535223263,
"acc_norm": 0.5022421524663677,
"acc_norm_stderr": 0.03355746535223263
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.04385162325601553,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.04385162325601553
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4049079754601227,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.4049079754601227,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.5048543689320388,
"acc_stderr": 0.04950504382128921,
"acc_norm": 0.5048543689320388,
"acc_norm_stderr": 0.04950504382128921
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.03193705726200293,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.03193705726200293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5772669220945083,
"acc_stderr": 0.017665180351954062,
"acc_norm": 0.5772669220945083,
"acc_norm_stderr": 0.017665180351954062
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4479768786127168,
"acc_stderr": 0.02677299065336182,
"acc_norm": 0.4479768786127168,
"acc_norm_stderr": 0.02677299065336182
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095277,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095277
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.028491993586171566,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.028491993586171566
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4533762057877814,
"acc_stderr": 0.02827435985489425,
"acc_norm": 0.4533762057877814,
"acc_norm_stderr": 0.02827435985489425
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.44753086419753085,
"acc_stderr": 0.027667138569422697,
"acc_norm": 0.44753086419753085,
"acc_norm_stderr": 0.027667138569422697
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.02746470844202213,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.02746470844202213
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3396349413298566,
"acc_stderr": 0.012095592506931974,
"acc_norm": 0.3396349413298566,
"acc_norm_stderr": 0.012095592506931974
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.27205882352941174,
"acc_stderr": 0.027033041151681456,
"acc_norm": 0.27205882352941174,
"acc_norm_stderr": 0.027033041151681456
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.39869281045751637,
"acc_stderr": 0.01980828131744985,
"acc_norm": 0.39869281045751637,
"acc_norm_stderr": 0.01980828131744985
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4636363636363636,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.4636363636363636,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.30612244897959184,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.30612244897959184,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4925373134328358,
"acc_stderr": 0.03535140084276719,
"acc_norm": 0.4925373134328358,
"acc_norm_stderr": 0.03535140084276719
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.03789134424611549,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.03789134424611549
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5321637426900585,
"acc_stderr": 0.038268824176603704,
"acc_norm": 0.5321637426900585,
"acc_norm_stderr": 0.038268824176603704
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156496,
"mc2": 0.3801259785228798,
"mc2_stderr": 0.014088609967255201
},
"harness|winogrande|5": {
"acc": 0.6803472770323599,
"acc_stderr": 0.013106528517665137
},
"harness|gsm8k|5": {
"acc": 0.2979529946929492,
"acc_stderr": 0.012597932232914513
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
xezpeleta/oasst2_top1_sharegpt_format | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: langs
dtype: string
splits:
- name: train
num_bytes: 17250318
num_examples: 10094
download_size: 9787675
dataset_size: 17250318
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
- es
- fr
- eu
size_categories:
- n<1K
--- |
SamaAI/sama-drives-california | ---
dataset_info:
features:
- name: fname
dtype: string
- name: path
dtype: string
- name: label
struct:
- name: attributes
struct:
- name: timeofday
dtype: string
- name: weather
dtype: string
- name: labels
list:
- name: attributes
struct:
- name: drivingConditions
dtype: string
- name: laneChange
dtype: string
- name: occluded
dtype: bool
- name: box2d
struct:
- name: x1
dtype: int64
- name: x2
dtype: int64
- name: y1
dtype: int64
- name: y2
dtype: int64
- name: category
dtype: string
- name: id
dtype: int64
- name: manualAttributes
dtype: bool
- name: manualShape
dtype: bool
- name: poly2d
list:
- name: closed
dtype: bool
- name: filled
dtype: bool
- name: vertices
sequence:
sequence: int64
- name: name
dtype: string
- name: img
dtype: image
splits:
- name: train
num_bytes: 1088252764.96
num_examples: 25136
download_size: 1025635407
dataset_size: 1088252764.96
license: cc-by-4.0
size_categories:
- 10K<n<100K
---
# Dataset Card for sama-drives-california

## Dataset Description
- **Homepage:** www.sama.com
- **Point of Contact:** datasets@samasource.org
### Dataset Summary
This is an object detection dataset (bounding boxes and polygons) of **25 136 frames** (848x480 pixels) taken by a dashboard video camera of a car driving in California.
The frames were captured at 1 FPS, so the footage covers roughly 7 hours of driving.
All but 110 frames (25 026 in total) contain at least one annotated object of interest.
## Dataset Structure
### Data Instances
The dataset is saved according to the `bdd100k` format described [here](https://doc.bdd100k.com/format.html#segmentation-formats) (no affiliation with Sama).
Frames are named according to the original video they are from, along with the sequence index in that video (1-indexed): **videoNumber-frameIndex.jpg** \
(e.g., 099-002.jpg for the second frame of the 99th video)
`label:id`s are used to denote unique objects, such as a specific vehicle, throughout an entire video, but not across videos.
The first digits of a `label:id` denote what video it is from (e.g., the `id` 53002 comes from video 53).
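For illustration, the source video can be recovered from a `label:id` as follows (a minimal sketch; it assumes ids are encoded as videoNumber × 1000 plus a per-video object index, which is consistent with the ids shown in this card):
```python
def video_of(label_id: int) -> int:
    # Assumption: ids are videoNumber * 1000 + per-video object index,
    # matching the examples in this card (1001 -> video 1, 53002 -> video 53).
    return label_id // 1000

assert video_of(1001) == 1
assert video_of(53002) == 53
```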
Frames were taken from videos that were recorded in a continuous sequence, with no time gap between videos. However, some videos were not included \
in the final dataset, either because they contained sensitive information or because they were part of a long stretch during which the car was parked and facing a scene of no interest.
The labelling format and different classes supported are described in the section Data Fields below.
Sample annotation:
```json
{
"name": "001-019.jpg",
"attributes": {"weather": "Sunny", "timeofday": "Day"},
"labels":
[
{"category": "Drivable Space", "attributes": {"occluded": true}, "manualShape": true, "manualAttributes": true, "id": 1001, "poly2d": [{"vertices": [[369, 296], [370, 276], [389, 277], [432, 278], [494, 279], [504, 266], [563, 262], [590, 270], [656, 271], [705, 276], [776, 270], [847, 274], [847, 337], [847, 419], [766, 408], [681, 402], [626, 400], [550, 393], [507, 391], [426, 390], [321, 387], [242, 394], [206, 402], [170, 402], [135, 399], [72, 405], [29, 413], [0, 418], [0, 259], [66, 259], [91, 267], [154, 265], [126, 280], [145, 288], [188, 284], [155, 265], [187, 265], [225, 263], [309, 260], [301, 271], [345, 272], [370, 276], [369, 296], [306, 300], [225, 300], [226, 312], [309, 334], [416, 353], [552, 373], [635, 375], [669, 365], [666, 343], [654, 338], [542, 313]], "closed": true, "filled": true}], "box2d": {"x1": 0, "y1": 259, "x2": 847, "y2": 419}},
{"category": "Vehicles | Truck", "attributes": {"occluded": true}, "manualShape": true, "manualAttributes": true, "id": 1041, "poly2d": [{"vertices": [[708, 247], [692, 247], [688, 251], [687, 258], [687, 265], [709, 265], [714, 265], [713, 255]], "closed": true, "filled": true}], "box2d": {"x1": 687, "y1": 247, "x2": 714, "y2": 265}},
{"category": "Vehicles | Truck", "attributes": {"occluded": true}, "manualShape": true, "manualAttributes": true, "id": 1043, "poly2d": [{"vertices": [[468, 238], [486, 251], [494, 253], [500, 257], [507, 258], [515, 262], [527, 267], [530, 278], [531, 293], [503, 300], [482, 299], [425, 291], [426, 296], [415, 298], [409, 291], [391, 288], [390, 299], [375, 300], [369, 289], [353, 284], [354, 254], [409, 256], [424, 238]], "closed": true, "filled": true}], "box2d": {"x1": 353, "y1": 238, "x2": 531, "y2": 300}},
{"category": "Vehicles | Car", "attributes": {"occluded": true}, "manualShape": true, "manualAttributes": true, "id": 1044, "poly2d": [{"vertices": [[560, 256], [539, 253], [541, 257], [553, 264], [561, 271], [563, 288], [568, 288], [584, 290], [596, 288], [599, 277], [595, 271], [589, 267], [577, 264], [570, 260]], "closed": true, "filled": true}], "box2d": {"x1": 539, "y1": 253, "x2": 599, "y2": 290}},
{"category": "Vehicles | Car", "attributes": {"occluded": true}, "manualShape": true, "manualAttributes": true, "id": 1045, "poly2d": [{"vertices": [[507, 246], [499, 247], [495, 248], [506, 255], [523, 262], [526, 270], [532, 281], [530, 295], [547, 296], [565, 294], [562, 271], [551, 261], [537, 254], [519, 251]], "closed": true, "filled": true}], "box2d": {"x1": 495, "y1": 246, "x2": 565, "y2": 296}},
{"category": "Vehicles | Car", "attributes": {"occluded": false, "drivingConditions": "Light Traffic"}, "manualShape": true, "manualAttributes": true, "id": 1046, "poly2d": [{"vertices": [[30, 249], [14, 249], [9, 256], [8, 262], [10, 271], [13, 271], [13, 269], [24, 269], [24, 271], [30, 271], [32, 268], [36, 268], [38, 271], [41, 269], [41, 263], [40, 256], [37, 252], [34, 250]], "closed": true, "filled": true}], "box2d": {"x1": 8, "y1": 249, "x2": 41, "y2": 271}}
]
}
```
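As a quick sanity check on this format (a sketch; it assumes `box2d` is the axis-aligned extent of the `poly2d` vertices, as it is for every label in the sample above):
```python
def box_matches_polygons(label: dict) -> bool:
    # Collect all polygon vertices of the label and compare their extent
    # to the stored box2d.
    xs = [x for poly in label["poly2d"] for x, _ in poly["vertices"]]
    ys = [y for poly in label["poly2d"] for _, y in poly["vertices"]]
    b = label["box2d"]
    return (min(xs), min(ys), max(xs), max(ys)) == (b["x1"], b["y1"], b["x2"], b["y2"])
```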
### Data Fields
Each frame contains a label for `timeofday` and `weather`. `Dusk`, `Dawn` and `Twilight` all fall in the same `timeofday` category.
| timeofday | weather |
|:--------------------|:--------|
| Day | Sunny |
| Night | Cloudy |
| Dusk/Dawn/Twilight | Rainy |
| | Snowy |
| | Other |
Bounding boxes are provided for all objects as `box2d`.
`Vehicles`, `People` and `Areas` are also identified with closed `Polygons` of the type `poly2d`.
`Lanes` are available as `Lines`, that are denoted as open `Polygons` of the type `poly2d`.
`Traffic Lights` and `Traffic Signs` are only available as `Bounding Boxes`.
| Vehicles (Polygons) | People (Polygons) | Areas (Polygons) | Lanes (Lines) | Traffic (Bounding Boxes) |
|:----------------------|:----------------------|:-------------------|:------------------|:--------------------------|
| Car | Pedestrians | Drivable Space | Current Lane | Traffic Lights |
| Truck | | | Alternate Lane | Traffic Signs |
| Van | | | Opposite Lane | |
| SUV | | | | |
| Bus | | | | |
| Other LV | | | | |
| Bicycles | | | | |
| Motorbikes | | | | |
The objects above can each be `occluded` (true) or not (false).
`Vehicles` also have a label called `drivingConditions` that denotes the amount of vehicle traffic they are facing.
Note that this label is not always present.
| drivingConditions (for Vehicles) |
|:------------------------------------|
| Light Traffic |
| Moderate Traffic |
| Heavy Traffic |
`Lanes` also contain a `laneChange` label. Note that this label is not always present.
| laneChange (for Lanes) |
|:---------------------------|
| Current |
| Alternate |
| Opposite |
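To get a quick feel for these distributions, the annotations can be tallied straight from `labels.json` (a minimal sketch, assuming the file is a list of per-frame records shaped like the sample annotation above):
```python
import json
from collections import Counter

with open("labels.json") as f:
    frames = json.load(f)  # assumed: one record per frame, as in the sample

weather = Counter(frame["attributes"]["weather"] for frame in frames)
categories = Counter(
    label["category"] for frame in frames for label in frame.get("labels", [])
)
print(weather)
print(categories.most_common(10))
```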
### Visualize Dataset
To visualize the dataset on the [FiftyOne](https://docs.voxel51.com/) app, download and unzip the following [zip file](https://sama-documentation-assets.s3.amazonaws.com/sama-drives-california/zipped/sama-drives-california.zip) (2.3GB).
```python
import fiftyone as fo
# <dataset_dir>/
# labels.json
# data/
# 001-001.jpg
# 001-002.jpg
# ...
name = "sama-drives-california"
dataset_dir = "/path/to/dataset"
# Create the dataset
dataset = fo.Dataset.from_dir(
dataset_dir=dataset_dir,
dataset_type=fo.types.BDDDataset,
name=name
)
```
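Once created, the dataset can be browsed interactively with the standard FiftyOne app launcher:
```python
# Launch the FiftyOne app to inspect frames and their annotations
session = fo.launch_app(dataset)
session.wait()
```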
### Dataset in Video Format
This dataset is also available as a video dataset with [FiftyOne](https://docs.voxel51.com/) style label format. You can download a zipped file of the dataset (videos and fiftyone labels) [here](https://sama-documentation-assets.s3.amazonaws.com/sama-drives-california/zipped/sama-drives-california-videos.zip) (1.1GB).
```python
import fiftyone as fo
# <video_dataset_dir>/
# frames.json
# metadata.json
# samples.json
# data/
# 001.mp4
# 002.mp4
# ...
name = "sama-drives-california-videos"
dataset_dir = "/path/to/videos-dataset"
# Create the dataset
dataset = fo.Dataset.from_dir(
dataset_dir=dataset_dir,
dataset_type=fo.types.FiftyOneDataset,
name=name
)
```
### Annotations
The dataset was annotated by a team of Sama Associates.
They were instructed to annotate all objects of the classes described in the section *Data Fields* above with the following details:
* Ignore objects under 10 pixels in width or height.
* Annotate with a pixel tolerance of 2 pixels.
* For motorized vehicles, include the mirrors but do not include the antennas.
* For bicycles, include the cyclist.
* For motorbikes, include the rider.
* For traffic lights, place the bounding box around the light fixture but not the pole.
* For traffic signs, do not include the pole or structure.
### Personal and Sensitive Information
All personal and sensitive information has been removed. Vehicle license plates and faces are blurred.
### Other Known Limitations
Objects of interest that were smaller than 10 pixels in width or height were not annotated.
### Licensing Information
[CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) |
tollefj/nor-instruct-combined | ---
language:
- nb
dataset_info:
features:
- name: instruction
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 32962041
num_examples: 67714
- name: test
num_bytes: 322359
num_examples: 684
download_size: 21130799
dataset_size: 33284400
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
A concatenated instruction-based dataset from the following:
- NbAiLab/norwegian-alpaca
- RuterNorway/Fleurs-Alpaca-EN-NO
- RuterNorway/OpenOrcaNo-15k
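A minimal loading sketch (the split and column names come from the metadata above; nothing else is assumed):
```python
from datasets import load_dataset

# Load the combined Norwegian instruction dataset
ds = load_dataset("tollefj/nor-instruct-combined")
print(ds["train"][0]["instruction"])
print(ds["train"][0]["response"])
```
 |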
PerceptionEval/Realness | ---
dataset_info:
features:
- name: idx
dtype: int32
- name: question
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: choices
sequence: string
- name: answer
dtype: string
- name: prompt
dtype: string
- name: image1_label
dtype:
class_label:
names:
'0': fake
'1': real
- name: image2_label
dtype:
class_label:
names:
'0': fake
'1': real
- name: image3_label
dtype:
class_label:
names:
'0': fake
'1': real
- name: image4_label
dtype:
class_label:
names:
'0': fake
'1': real
splits:
- name: test
num_bytes: 19749619.0
num_examples: 132
- name: val
num_bytes: 19625645.0
num_examples: 132
download_size: 39275882
dataset_size: 39375264.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: val
path: data/val-*
---
|
mangeshdiyewar/vivekanada | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3975158
num_examples: 227
download_size: 2261786
dataset_size: 3975158
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MatsuoDochiai/Leoncio | ---
license: openrail
---
|
Mohannad/test_NextAudioGen_uvr | ---
license: mit
---
|
AstraMindAI/RLAIF-Nectar | ---
license: cc-by-nc-4.0
--- |
CyberHarem/m1887_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of m1887/M1887/M1887 (Girls' Frontline)
This is the dataset of m1887/M1887/M1887 (Girls' Frontline), containing 23 images and their tags.
The core tags of this character are `multicolored_hair, long_hair, red_eyes, red_hair, streaked_hair, breasts, earrings, brown_hair, large_breasts, hair_between_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 20.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1887_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 14.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1887_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 47 | 25.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1887_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 19.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1887_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 47 | 33.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1887_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m1887_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, solo, looking_at_viewer, jewelry, black_jacket, cleavage_cutout, gloves, gun, black_leotard, holding, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | jewelry | black_jacket | cleavage_cutout | gloves | gun | black_leotard | holding | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:----------|:---------------|:------------------|:---------|:------|:----------------|:----------|:--------------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
everycoffee/autotrain-data-coffee-beans | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: coffee-beans
## Dataset Description
This dataset has been automatically processed by AutoTrain for project coffee-beans.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<224x224 RGB PIL image>",
"feat_width": 224,
"feat_height": 224,
"target": 1,
"feat_xmin": 22,
"feat_ymin": 61,
"feat_xmax": 140,
"feat_ymax": 160
},
{
"image": "<224x224 RGB PIL image>",
"feat_width": 224,
"feat_height": 224,
"target": 1,
"feat_xmin": 34,
"feat_ymin": 13,
"feat_xmax": 205,
"feat_ymax": 164
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"feat_width": "Value(dtype='int64', id=None)",
"feat_height": "Value(dtype='int64', id=None)",
"target": "ClassLabel(names=['defect', 'good'], id=None)",
"feat_xmin": "Value(dtype='int64', id=None)",
"feat_ymin": "Value(dtype='int64', id=None)",
"feat_xmax": "Value(dtype='int64', id=None)",
"feat_ymax": "Value(dtype='int64', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 3348 |
| valid | 1237 |
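A minimal loading sketch (field and split names come from the schema and table above):
```python
from datasets import load_dataset

ds = load_dataset("everycoffee/autotrain-data-coffee-beans")
sample = ds["train"][0]
# target is a ClassLabel: 0 = defect, 1 = good
print(sample["target"])
print(sample["feat_xmin"], sample["feat_ymin"], sample["feat_xmax"], sample["feat_ymax"])
```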
|
YuehHanChen/forecasting | ---
license: apache-2.0
---
<h1 align="center">Dataset from "Approaching Human-Level Forecasting with Language Models"</h1>
<p>This document details the curated dataset developed for our research paper, <strong><a href="https://arxiv.org/abs/2402.18563" target="_blank">Approaching Human-Level Forecasting with Language Models</a></strong>, authored by <a href="mailto:dhalawi@berkeley.edu">Danny Halawi</a>, <a href="mailto:z0@eecs.berkeley.edu">Fred Zhang</a>, <a href="mailto:john0922ucb@berkeley.edu">Chen Yueh-Han</a>, and <a href="mailto:jsteinhardt@berkeley.edu">Jacob Steinhardt</a>.</p>
<h2>Data Source and Format</h2>
<p>The dataset is compiled from forecasting platforms including Metaculus, Good Judgment Open, INFER, Polymarket, and Manifold. These platforms enable users to predict future events by assigning probabilities to different outcomes, structured as follows:</p>
<ul>
<li><strong>Background Description:</strong> Contextual information for each forecasting question.</li>
<li><strong>Resolution Criterion:</strong> Guidelines on how and when each question is considered resolved.</li>
<li><strong>Timestamps:</strong> Key dates including the publication (begin date), forecast submission deadline (close date), and outcome resolution (resolve date).</li>
</ul>
<p>Submissions are accepted between the begin date and the earlier of the resolve or close dates. See <em>Table 1</em> in our paper for an in-depth example.</p>
<h2>Raw Data Composition</h2>
<p>The raw dataset encompasses 48,754 questions and 7,174,607 user forecasts from 2015 to 2024, across various question types and topics globally. However, it includes challenges such as ill-defined questions and a significant imbalance in source platform contributions post-June 1, 2023. For a complete view of the raw data, visit <a href="https://huggingface.co/datasets/YuehHanChen/forecasting_raw" target="_blank">our dataset on Hugging Face</a>.</p>
<h2>Data Curation Process</h2>
<p>To refine the dataset for analytical rigor, we undertook the following steps:</p>
<ul>
<li><strong>Filtering:</strong> Exclusion of ill-defined, overly personal, or niche-interest questions to ensure data quality and relevance.</li>
<li><strong>Adjustment for Imbalance:</strong> Careful selection to mitigate the recent source imbalance, focusing on a diverse representation of forecasting questions.</li>
<li><strong>Binary Focus:</strong> Conversion of multiple-choice questions to binary format, concentrating on binary outcomes for a streamlined analysis.</li>
<li><strong>Temporal Segregation:</strong> To prevent leakage from language models' pre-training, the test set includes only questions published after June 1, 2023, with earlier questions allocated to the training and validation sets.</li>
</ul>
<p>This curation resulted in 5,516 binary questions, with 3,762 for training, 840 for validation, and 914 for testing. Detailed examples and curation insights are provided in <em>Table 2a</em> and <em>Appendix C</em> of our paper.</p>
<h2>Significance for Research</h2>
<p>The curated dataset is pivotal for our investigation into language models' forecasting capabilities, aiming to benchmark against or exceed human predictive performance. It enables focused analysis on high-quality, relevant forecasting questions.</p>
<p>Detailed methodologies and insights from our study are available in the linked paper at the beginning of this document. We invite feedback and collaboration to further this field of research.</p>
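<p>A minimal loading sketch (the repository id is as above; the split names are not documented in this card, so print the dataset dict before indexing into it):</p>

```python
from datasets import load_dataset

# Inspect the curated question set; split names are an assumption to verify
ds = load_dataset("YuehHanChen/forecasting")
print(ds)
```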
<h2>How to Cite</h2>
<p>If you find our dataset and research useful for your work, please cite it using the following BibTeX entry:</p>
```bibtex
@misc{halawi2024approaching,
  title={Approaching Human-Level Forecasting with Language Models},
  author={Danny Halawi and Fred Zhang and Chen Yueh-Han and Jacob Steinhardt},
  year={2024},
  eprint={2402.18563},
  archivePrefix={arXiv},
  primaryClass={cs.LG}
}
```
|
mobinx/Quran_ASR | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-conll2003-conll2003-df31a4-1679759345 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: bhadresh-savani/electra-base-discriminator-finetuned-conll03-english
metrics: []
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: bhadresh-savani/electra-base-discriminator-finetuned-conll03-english
* Dataset: conll2003
* Config: conll2003
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@opfaffel@gmail.com](https://huggingface.co/opfaffel@gmail.com) for evaluating this model. |
andersonbcdefg/synthetic_retrieval_tasks | ---
license: mit
tags:
- synthetic
---
Synthetic data designed as prompts for generating training data for retrieval embedding models.
The "iteration" column refers to how the data was generated.
Iteration 1: Using the following pool of seed tasks, prompt GPT-3.5-Turbo to generate additional tasks (a sketch of this step follows the seed list).
```python
RETRIEVAL_EXAMPLES = [
'Provide a scientific claim as query, retrieve documents that help verify or refute the claim.',
'Search for documents that answers a FAQ-style query on children\'s nutrition.',
"Retrieve company's financial reports for a given stock ticker symbol.",
"Given a book name as a query, retrieve reviews, ratings and summaries of that book.",
"Search for scientific research papers supporting a medical diagnosis for a specified disease.",
"Given a question, retrieve Wikipedia passages that answer the question.",
"Provided a user question, retrieve the highest voted answers on Reddit ELI5 forum.",
"Given a web search engine query, retrieve relevant passages that answer the query.",
"Find Amazon reviews similar to the input review.",
"Find the song lyrics most related to the user's search.",
"Given a multi-hop question, retrieve documents that can help answer the question.",
"Retrieve tweets that are semantically similar to the given tweet",
"Given a news summary, retrieve other semantically similar summaries",
"Given a question, retrieve relevant answers from Stackexchange",
"Given a scientific paper title, retrieve paper abstracts that are cited by the given paper."
]
```
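Illustratively, Iteration 1 might look like the sketch below. Only the seed pool above comes from the dataset; the prompt wording, sample count, and model call are assumptions:
```python
import random
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def generate_tasks(seed_pool, n_seeds=5):
    """Show the model a few seed tasks and ask it for new ones."""
    seeds = random.sample(seed_pool, n_seeds)
    prompt = (
        "Here are examples of retrieval tasks:\n"
        + "\n".join(f"- {s}" for s in seeds)
        + "\nWrite 10 new, diverse retrieval tasks in the same style."
    )
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

print(generate_tasks(RETRIEVAL_EXAMPLES))
```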
Iteration 2: Using the ~40,000 tasks generated in Iteration 1 as seed tasks, prompt GPT-3.5-Turbo to generate additional tasks.
Iteration 3: Using the ~80,000 tasks generated in Iterations 1-2 as seed tasks, prompt GPT-4-Turbo to generate additional tasks. |
Nahrawy/VIDIT-FAID-Depth-ControlNet | ---
dataset_info:
features:
- name: scene
dtype: string
- name: image
dtype: image
- name: depth_map
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 32894078944.7
num_examples: 17550
download_size: 32257586708
dataset_size: 32894078944.7
---
# Dataset Card for "VIDIT-FAID-Depth-ControlNet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deepghs/artists_index | ---
license: mit
---
|
achandlr/BatchPrompting | ---
language: en
license:
- mit
tags:
- natural-language-processing
- machine-learning
---
# Batch Prompting Dataset for Fine-Tuning Large Language Models
## Overview
The Batch Prompting Dataset is a comprehensive collection of text-based question-answer pairs designed to fine-tune and evaluate the performance of large language models (LLMs) across a diverse range of tasks. This dataset aims to facilitate research and development in the field of natural language processing (NLP) by providing a standardized benchmark for assessing the capabilities of LLMs in various domains, such as commonsense reasoning, textual entailment, sentiment analysis, and question answering.
### Dataset Description
The dataset consists of a wide array of tasks sourced from popular NLP benchmarks, including but not limited to:
- The entire GLUE benchmark
- Recognizing Textual Entailment (RTE)
- Multi-Genre Natural Language Inference (MNLI)
- Corpus of Linguistic Acceptability (CoLA)
- Stanford Sentiment Treebank (SST-2)
- Microsoft Research Paraphrase Corpus (MRPC)
- Semantic Textual Similarity Benchmark (STS-B)
- Quora Question Pairs (QQP)
- Question NLI (QNLI)
- Winograd NLI (WNLI)
- Grade School Math (GSM8K)
- CommonsenseQA
- RACE (Large-scale reading comprehension dataset)
Each task is accompanied by a clear description, specifying the objective, input format, and expected output format. The dataset provides a mix of classification, multiple-choice, and open-ended questions, allowing for a comprehensive evaluation of an LLM's performance across different problem types.
The question-answer pairs are organized into batches, enabling efficient fine-tuning and evaluation of LLMs in a batched setting. This batch prompting approach allows for more effective utilization of computational resources and faster training times compared to single-instance prompting.
In addition to the question-answer pairs, the dataset includes metadata such as task type, difficulty level, and source dataset. This information can be used to filter and analyze the performance of LLMs based on specific criteria.
### Dataset Structure
The dataset is provided in a structured format, with each row containing the following fields:
- `input`: The input text or question.
- `output`: The corresponding output or answer.
- `k_shot_size`: The number of few-shot examples provided for the task (range: 1-6).
- `batch_size`: The number of question-answer pairs in the batch (range: 0-6).
- `task`: The name of the task or source dataset.
- `split`: The data split (e.g., train, validation, test).
- `text`: The full text of the question-answer pair, including any additional context or instructions.
The dataset is split into training, validation, and test sets, allowing for proper evaluation and comparison of LLM performance. The training set can be used to fine-tune the models, while the validation and test sets serve as held-out data for assessing generalization and performance on unseen examples.
### Applications
The Batch Prompting Dataset is designed to support a wide range of applications in NLP research and development, including:
- Fine-tuning and evaluating LLMs for specific tasks or domains
- Comparing the performance of different LLM architectures and training strategies
- Investigating the impact of batch size and few-shot learning on LLM performance
- Analyzing the transferability of knowledge across tasks and domains
- Developing new prompting techniques and strategies for improved LLM performance
### Usage
To use the Batch Prompting Dataset, load it with the Hugging Face `datasets` library:
```python
from datasets import load_dataset

dataset = load_dataset("achandlr/BatchPrompting")
```
The dataset can then be easily integrated into existing NLP pipelines and frameworks for fine-tuning and evaluation of LLMs.
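Because every row carries `task`, `batch_size`, and `k_shot_size` metadata, task- or batch-specific subsets are easy to carve out. A minimal sketch (the exact strings in the `task` column are an assumption; inspect them first):
```python
from datasets import load_dataset

ds = load_dataset("achandlr/BatchPrompting")
print(set(ds["train"]["task"]))  # see which task names are actually present

# Keep only larger-batch examples for one source task ("CommonsenseQA" is assumed)
subset = ds["train"].filter(
    lambda row: row["task"] == "CommonsenseQA" and row["batch_size"] >= 4
)
print(len(subset))
```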
<!--
Citation
If you use the Batch Prompting Dataset in your research or development work, please cite the following paper:
@misc{batch_prompting_dataset_2023,
title={Batch Prompting Dataset for Fine-Tuning Large Language Models},
author={Author Name},
year={2023},
howpublished={\url{https://huggingface.co/datasets/batch_prompting_dataset}}
} -->
<!-- License
The Batch Prompting Dataset is released under the Creative Commons Attribution 4.0 International (CC BY 4.0) license. By using this dataset, you agree to comply with the terms and conditions of the license. -->
### Contact
For questions, feedback, or support, please contact the dataset maintainer at alex.chandler@utexas.edu.
We hope that the Batch Prompting Dataset contributes to the advancement of NLP research and the development of more capable and robust language models. Happy fine-tuning! |
jxie/higgs-normalized | ---
dataset_info:
features:
- name: label
dtype: float64
- name: inputs
sequence: float64
splits:
- name: train
num_bytes: 2478000000
num_examples: 10500000
- name: test
num_bytes: 118000000
num_examples: 500000
- name: train_1k
num_bytes: 236000
num_examples: 1000
- name: train_10k
num_bytes: 2360000
num_examples: 10000
- name: train_100k
num_bytes: 23600000
num_examples: 100000
download_size: 2144173073
dataset_size: 2622196000
---
# Dataset Card for "higgs-normalized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Saleh11623/googlestore | ---
task_categories:
- question-answering
tags:
- not-for-all-audiences
--- |
zh-tw-llm-dv-dv/zh-tw-llm-dev-sample-ta8k-d40d11-only_embeddings-tr__alp-a1a0fd-c2048 | ---
dataset_info:
dataset_size: 1917406.0
download_size: 623887
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: preview
  dtype: string
splits:
- name: train
num_bytes: 1917406.0
num_examples: 400
---
# zh-tw-llm-dev-sample-ta8k-d40d11-only_embeddings-tr__alp-a1a0fd-c2048
This dataset is a part of the `zh-tw-llm-dev` project.
* Tokenizer: `zh-tw-llm-dev-tokenizer-a8k-d40d11`
* Built with: `translations`, `wikipedia`, `alpaca`
* Rows: `400`
* Max length: `2048`
* Full config:
```json
{"build_with": ["translations", "wikipedia", "alpaca"], "preview_length": 256, "translations_settings": {"source_dataset": "zetavg/coct-en-zh-tw-translations-twp-300k", "lang_1_key": "en", "lang_2_key": "ch", "templates": ["English: {lang_1}\nChinese: {lang_2}", "Chinese: {lang_2}\nEnglish: {lang_1}"], "rows_limit": 100}, "wikipedia_settings": {"source_dataset": "zetavg/zh-tw-wikipedia", "exclude": [{"content_length_longer_than": 512}, {"match": "小行星", "in": "markdown", "in_range": [0, 40]}, {"match": "是中華人民共和國", "in": "markdown", "in_range": [0, 80]}], "rows_limit": 100}, "alpaca_settings": {"source_dataset": "zetavg/traditional-chinese-alpaca-en-align", "template": "short", "train_on_inputs": false, "rows_limit": 100}}
``` |
wanadzhar913/crawl-mat-gaming | ---
license: apache-2.0
language:
- ms
---
# TLDR
* Website: [mat-gaming](https://mat-gaming.com/)
* Num. pages scraped: 49
* Remaining pages: 0
* Date of scraping: 4th August 2023
* Text data language: Bahasa Melayu
* Contributed to: https://github.com/huseinzol05/malaysian-dataset
* Pull request: https://github.com/huseinzol05/malaysian-dataset/pull/242 |
mit-han-lab/vww-s256 | ---
license: mit
---
|
polm-stability/jblimp | ---
language:
- ja
---
# JBLiMP
This is the data from "JBLiMP: Japanese Benchmark of Linguistic Minimal Pairs" (Someya and Oseki, 2023). Only the validated pairs used for benchmarking are included, and only in JSONL format, since the TSV is redundant with it.
For details see [the original git repo](https://github.com/osekilab/JBLiMP) or [the paper](https://aclanthology.org/2023.findings-eacl.117/).
|
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xl_mode_C_Q_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__
num_bytes: 141084
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean_
num_bytes: 141084
num_examples: 1000
download_size: 104516
dataset_size: 282168
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xl_mode_C_Q_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hippocrates/HealthCareMagic_train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 202222080
num_examples: 112165
download_size: 112037983
dataset_size: 202222080
---
# Dataset Card for "HealthCareMagic_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pnadel/met-ds-0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 609545229.0
num_examples: 628
download_size: 607994410
dataset_size: 609545229.0
---
# Dataset Card for "met-ds-0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Milanesero/homero_simpson | ---
license: bigscience-openrail-m
---
|