| datasetId | card |
|---|---|
liuyanchen1015/MULTI_VALUE_rte_yall | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 22293
num_examples: 42
- name: train
num_bytes: 17700
num_examples: 34
download_size: 36392
dataset_size: 39993
---
# Dataset Card for "MULTI_VALUE_rte_yall"
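Per the `dataset_info` above, each example carries two sentences, an entailment label, an index, and a `value_score`. A minimal sketch of one record — the field names come from the card, but the sample values and the premise/hypothesis reading of the two sentences are our assumptions:

```python
from dataclasses import dataclass

@dataclass
class RteExample:
    """One row, mirroring the `dataset_info` features in the card."""
    sentence1: str    # presumably the premise
    sentence2: str    # presumably the hypothesis
    label: int        # entailment label (int64 in the card)
    idx: int          # example index
    value_score: int  # MULTI-VALUE transformation score

# illustrative values only -- not taken from the dataset
ex = RteExample(
    sentence1="Y'all finished the report yesterday.",
    sentence2="The report is done.",
    label=0,
    idx=7,
    value_score=1,
)
print(ex.label, ex.value_score)
```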
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Klarks/naruto | ---
license: afl-3.0
---
|
BevenRozario/job_desc_3k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train_dataset
num_bytes: 5388093.9
num_examples: 2700
- name: eval_dataset
num_bytes: 598677.1
num_examples: 300
download_size: 1640958
dataset_size: 5986771.0
configs:
- config_name: default
data_files:
- split: train_dataset
path: data/train_dataset-*
- split: eval_dataset
path: data/eval_dataset-*
---
|
Tongjilibo/BD_Knowledge_Extraction | ---
license: apache-2.0
---
# Baidu Relation Extraction Dataset
- Official site: http://ai.baidu.com/broad/download?dataset=sked |
Betilo/dalva98 | ---
license: openrail
---
|
AdapterOcean/dollyaug-standardized_cluster_2_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2928085
num_examples: 3106
download_size: 1723057
dataset_size: 2928085
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dollyaug-standardized_cluster_2_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/sajou_yukimi_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sajou_yukimi/佐城雪美 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of sajou_yukimi/佐城雪美 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `long_hair, red_eyes, bangs, blue_hair, blunt_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 572.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sajou_yukimi_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 350.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sajou_yukimi_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1177 | 740.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sajou_yukimi_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 513.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sajou_yukimi_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1177 | 1007.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sajou_yukimi_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
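The `800` and `1200` packages cap the shorter image side at the stated pixel count while preserving aspect ratio. A minimal sketch of that downscaling rule (the function is ours for illustration, not part of the dataset tooling):

```python
def cap_shorter_side(width: int, height: int, limit: int) -> tuple[int, int]:
    """Scale (width, height) down so the shorter side is at most `limit`,
    keeping the aspect ratio; images already small enough are untouched."""
    shorter = min(width, height)
    if shorter <= limit:
        return width, height
    scale = limit / shorter
    return round(width * scale), round(height * scale)

# a 1600x1200 image has shorter side 1200, so it is scaled down to cap at 800
print(cap_shorter_side(1600, 1200, 800))
```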
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/sajou_yukimi_idolmastercinderellagirls',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, black_skirt, black_thighhighs, long_sleeves, looking_at_viewer, red_ribbon, solo, frilled_skirt, neck_ribbon, white_background, white_shirt, blush, closed_mouth, simple_background, frilled_sleeves, very_long_hair, zettai_ryouiki, smile, wide_sleeves |
| 1 | 15 |  |  |  |  |  | 1girl, black_skirt, blush, frilled_skirt, juliet_sleeves, simple_background, solo, very_long_hair, white_background, white_shirt, black_thighhighs, looking_at_viewer, black_hair, red_ribbon, braid, wide_sleeves, striped_panties, closed_mouth, neck_ribbon, small_breasts, feet_out_of_frame, skirt_lift, :<, flying_sweatdrops, garter_straps, lifted_by_self |
| 2 | 5 |  |  |  |  |  | 1girl, black_thighhighs, blush, cat_ears, cat_girl, cat_tail, kemonomimi_mode, looking_at_viewer, paw_gloves, red_ribbon, shadow, solo, very_long_hair, white_background, wide_sleeves, all_fours, black_skirt, braid, frilled_skirt, neck_ribbon, pleated_skirt, simple_background, white_shirt, black_gloves, black_hair, juliet_sleeves, no_shoes, parted_lips, tail_raised, triangle_mouth, full_body, garter_straps |
| 3 | 25 |  |  |  |  |  | 1girl, solo, smile, dress, frills, lolita_hairband, looking_at_viewer, blush, ribbon, black_pantyhose, gothic_lolita, bow, sitting |
| 4 | 9 |  |  |  |  |  | 1girl, blush, enmaided, looking_at_viewer, maid_apron, maid_headdress, solo, smile, cat_ears, long_sleeves, white_apron, black_dress, black_footwear, frilled_dress, full_body, blue_bow, bowtie, fake_animal_ears, mary_janes, white_background |
| 5 | 5 |  |  |  |  |  | 1girl, black_hair, blush, cat_ears, open_mouth, solo, looking_at_viewer, paw_pose, heart, tail, flat_chest, navel, small_breasts, swimsuit |
| 6 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, side-tie_bikini_bottom, solo, simple_background, white_background, cat_ears, cat_tail, front-tie_top, white_bikini, ass_visible_through_thighs, breasts, cameltoe, flat_chest, micro_bikini, open_mouth, thigh_gap |
| 7 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, collarbone, school_swimsuit, simple_background, white_background, flat_chest, smile, blue_one-piece_swimsuit, closed_mouth, covered_navel, sitting, small_breasts, wet |
| 8 | 6 |  |  |  |  |  | 1girl, frills, hair_bow, kimono, smile, solo, black_gloves, looking_at_viewer, wide_sleeves, black_cat, hair_rings, sitting, twin_braids, blush, floral_print, pantyhose, striped |
| 9 | 6 |  |  |  |  |  | 1girl, blush, looking_at_viewer, obi, smile, solo, hair_flower, wide_sleeves, floral_print, long_sleeves, black_hair, print_kimono, sidelocks |
| 10 | 6 |  |  |  |  |  | bare_shoulders, blush, detached_collar, playboy_bunny, rabbit_ears, 1girl, black_hairband, black_leotard, fake_animal_ears, looking_at_viewer, small_breasts, solo, strapless_leotard, very_long_hair, white_background, black_bowtie, brown_pantyhose, covered_navel, fishnet_pantyhose, garter_straps, simple_background, white_collar, bare_arms, closed_mouth, rabbit_tail, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_skirt | black_thighhighs | long_sleeves | looking_at_viewer | red_ribbon | solo | frilled_skirt | neck_ribbon | white_background | white_shirt | blush | closed_mouth | simple_background | frilled_sleeves | very_long_hair | zettai_ryouiki | smile | wide_sleeves | juliet_sleeves | black_hair | braid | striped_panties | small_breasts | feet_out_of_frame | skirt_lift | :< | flying_sweatdrops | garter_straps | lifted_by_self | cat_ears | cat_girl | cat_tail | kemonomimi_mode | paw_gloves | shadow | all_fours | pleated_skirt | black_gloves | no_shoes | parted_lips | tail_raised | triangle_mouth | full_body | dress | frills | lolita_hairband | ribbon | black_pantyhose | gothic_lolita | bow | sitting | enmaided | maid_apron | maid_headdress | white_apron | black_dress | black_footwear | frilled_dress | blue_bow | bowtie | fake_animal_ears | mary_janes | open_mouth | paw_pose | heart | tail | flat_chest | navel | swimsuit | side-tie_bikini_bottom | front-tie_top | white_bikini | ass_visible_through_thighs | breasts | cameltoe | micro_bikini | thigh_gap | collarbone | school_swimsuit | blue_one-piece_swimsuit | covered_navel | wet | hair_bow | kimono | black_cat | hair_rings | twin_braids | floral_print | pantyhose | striped | obi | hair_flower | print_kimono | sidelocks | bare_shoulders | detached_collar | playboy_bunny | rabbit_ears | black_hairband | black_leotard | strapless_leotard | black_bowtie | brown_pantyhose | fishnet_pantyhose | white_collar | bare_arms | rabbit_tail |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------|:-------------------|:---------------|:--------------------|:-------------|:-------|:----------------|:--------------|:-------------------|:--------------|:--------|:---------------|:--------------------|:------------------|:-----------------|:-----------------|:--------|:---------------|:-----------------|:-------------|:--------|:------------------|:----------------|:--------------------|:-------------|:-----|:--------------------|:----------------|:-----------------|:-----------|:-----------|:-----------|:------------------|:-------------|:---------|:------------|:----------------|:---------------|:-----------|:--------------|:--------------|:-----------------|:------------|:--------|:---------|:------------------|:---------|:------------------|:----------------|:------|:----------|:-----------|:-------------|:-----------------|:--------------|:--------------|:-----------------|:----------------|:-----------|:---------|:-------------------|:-------------|:-------------|:-----------|:--------|:-------|:-------------|:--------|:-----------|:-------------------------|:----------------|:---------------|:-----------------------------|:----------|:-----------|:---------------|:------------|:-------------|:------------------|:--------------------------|:----------------|:------|:-----------|:---------|:------------|:-------------|:--------------|:---------------|:------------|:----------|:------|:--------------|:---------------|:------------|:-----------------|:------------------|:----------------|:--------------|:-----------------|:----------------|:--------------------|:---------------|:------------------|:--------------------|:---------------|:------------|:--------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | X | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | X | X | | X | | X | | | X | X | X | X | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 25 |  |  |  |  |  | X | | | | X | | X | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | X | X | | X | | | X | | X | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | X | | X | | | | | X | | | | | | | | | X | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | X | | X | | | X | | X | | X | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | | X | | X | | | X | | X | X | X | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | | X | | X | | | | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | | | X | X | | X | | | | | X | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | X | | | | X | | X | | | X | | X | X | X | | X | | X | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
ricahrd/McPedrinho | ---
license: openrail
---
|
learnanything/ranking | ---
license: unknown
---
|
NBayer/test_6_rows | ---
license: openrail
---
|
nandovallec/df_ps_train_extra | ---
license: apache-2.0
---
|
parksimon0808/prm800k-llama-verifier | ---
dataset_info:
features:
- name: texts
dtype: string
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 4528067256
num_examples: 1052294
- name: test
num_bytes: 145143622
num_examples: 32408
download_size: 353282233
dataset_size: 4673210878
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "prm800k-llama"
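The features above pair a token sequence (`input_ids`) with an equally long per-token `labels` sequence, the usual layout for token-level verifier training. A hypothetical record in that shape — all values are invented, and the `-100` ignore-index is a common convention, not something the card confirms:

```python
# hypothetical record following the card's schema: `texts` (string),
# `input_ids` (sequence of int32), `labels` (sequence of int64)
record = {
    "texts": "Step 1: 2 + 2 = 4",
    "input_ids": [101, 345, 678, 902, 102],
    "labels": [-100, -100, -100, -100, 1],  # e.g. only the final token scored
}

# per-token labels must align one-to-one with the tokens
assert len(record["input_ids"]) == len(record["labels"])
```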
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AA051610__A11P | ---
pretty_name: Evaluation run of AA051610/A11P
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/A11P](https://huggingface.co/AA051610/A11P) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__A11P\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-11T02:59:53.573351](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A11P/blob/main/results_2023-12-11T02-59-53.573351.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7024107525999363,\n\
\ \"acc_stderr\": 0.030362293861859797,\n \"acc_norm\": 0.7062972608094896,\n\
\ \"acc_norm_stderr\": 0.030951825247607496,\n \"mc1\": 0.41370869033047736,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.5644074616941972,\n\
\ \"mc2_stderr\": 0.015397066221595713\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6040955631399317,\n \"acc_stderr\": 0.014291228393536587,\n\
\ \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893449\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6191993626767576,\n\
\ \"acc_stderr\": 0.004845912857338663,\n \"acc_norm\": 0.8253335988846843,\n\
\ \"acc_norm_stderr\": 0.003789055487003176\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.030167533468632726,\n\
\ \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.030167533468632726\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741706,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741706\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.03396116205845335,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.03396116205845335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7063829787234043,\n \"acc_stderr\": 0.02977164271249123,\n\
\ \"acc_norm\": 0.7063829787234043,\n \"acc_norm_stderr\": 0.02977164271249123\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.5526315789473685,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5476190476190477,\n \"acc_stderr\": 0.025634258115554955,\n \"\
acc_norm\": 0.5476190476190477,\n \"acc_norm_stderr\": 0.025634258115554955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8548387096774194,\n\
\ \"acc_stderr\": 0.02003956362805328,\n \"acc_norm\": 0.8548387096774194,\n\
\ \"acc_norm_stderr\": 0.02003956362805328\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.034991131376767445,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656208,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656208\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.02325315795194208,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02325315795194208\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223144,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223144\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7692307692307693,\n \"acc_stderr\": 0.02136202772522272,\n \
\ \"acc_norm\": 0.7692307692307693,\n \"acc_norm_stderr\": 0.02136202772522272\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7857142857142857,\n \"acc_stderr\": 0.026653531596715484,\n\
\ \"acc_norm\": 0.7857142857142857,\n \"acc_norm_stderr\": 0.026653531596715484\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8825688073394495,\n \"acc_stderr\": 0.01380278022737734,\n \"\
acc_norm\": 0.8825688073394495,\n \"acc_norm_stderr\": 0.01380278022737734\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8970588235294118,\n\
\ \"acc_stderr\": 0.021328337570804365,\n \"acc_norm\": 0.8970588235294118,\n\
\ \"acc_norm_stderr\": 0.021328337570804365\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n\
\ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.02779017706438359,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.02779017706438359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951539,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951539\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547129,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547129\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n\
\ \"acc_stderr\": 0.032472243899179465,\n \"acc_norm\": 0.8703703703703703,\n\
\ \"acc_norm_stderr\": 0.032472243899179465\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.029634717272371047,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.029634717272371047\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.019119892798924978,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.019119892798924978\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8850574712643678,\n\
\ \"acc_stderr\": 0.01140572072459397,\n \"acc_norm\": 0.8850574712643678,\n\
\ \"acc_norm_stderr\": 0.01140572072459397\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7774566473988439,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.7774566473988439,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.02383930331139819,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.02383930331139819\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262185,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262185\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5425531914893617,\n \"acc_stderr\": 0.029719281272236844,\n \
\ \"acc_norm\": 0.5425531914893617,\n \"acc_norm_stderr\": 0.029719281272236844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5338983050847458,\n\
\ \"acc_stderr\": 0.012740853872949839,\n \"acc_norm\": 0.5338983050847458,\n\
\ \"acc_norm_stderr\": 0.012740853872949839\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.027472274473233818,\n\
\ \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.027472274473233818\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7483660130718954,\n \"acc_stderr\": 0.01755581809132227,\n \
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.01755581809132227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.02783302387139968,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.02783302387139968\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900826,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900826\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061445,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061445\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41370869033047736,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.5644074616941972,\n\
\ \"mc2_stderr\": 0.015397066221595713\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.011268519971577684\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.604245640636846,\n \
\ \"acc_stderr\": 0.013469823701048815\n }\n}\n```"
repo_url: https://huggingface.co/AA051610/A11P
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|arc:challenge|25_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|gsm8k|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hellaswag|10_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T02-59-53.573351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-11T02-59-53.573351.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- '**/details_harness|winogrande|5_2023-12-11T02-59-53.573351.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-11T02-59-53.573351.parquet'
- config_name: results
data_files:
- split: 2023_12_11T02_59_53.573351
path:
- results_2023-12-11T02-59-53.573351.parquet
- split: latest
path:
- results_2023-12-11T02-59-53.573351.parquet
---
# Dataset Card for Evaluation run of AA051610/A11P
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AA051610/A11P
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AA051610/A11P](https://huggingface.co/AA051610/A11P) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
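The split name is derived from the run timestamp that appears in the result filenames, with dashes replaced by underscores. This mapping can be sketched as follows (an illustrative helper, not part of the dataset tooling):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp as it appears in result filenames
    (e.g. "2023-12-11T02-59-53.573351") to the corresponding
    split name in this dataset's configs, where dashes become
    underscores (e.g. "2023_12_11T02_59_53.573351")."""
    return ts.replace("-", "_")

print(timestamp_to_split("2023-12-11T02-59-53.573351"))
# → 2023_12_11T02_59_53.573351
```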
An additional "results" configuration stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__A11P",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-11T02:59:53.573351](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A11P/blob/main/results_2023-12-11T02-59-53.573351.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.7024107525999363,
"acc_stderr": 0.030362293861859797,
"acc_norm": 0.7062972608094896,
"acc_norm_stderr": 0.030951825247607496,
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.5644074616941972,
"mc2_stderr": 0.015397066221595713
},
"harness|arc:challenge|25": {
"acc": 0.6040955631399317,
"acc_stderr": 0.014291228393536587,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893449
},
"harness|hellaswag|10": {
"acc": 0.6191993626767576,
"acc_stderr": 0.004845912857338663,
"acc_norm": 0.8253335988846843,
"acc_norm_stderr": 0.003789055487003176
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8355263157894737,
"acc_stderr": 0.030167533468632726,
"acc_norm": 0.8355263157894737,
"acc_norm_stderr": 0.030167533468632726
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741706,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741706
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.03396116205845335,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.03396116205845335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7063829787234043,
"acc_stderr": 0.02977164271249123,
"acc_norm": 0.7063829787234043,
"acc_norm_stderr": 0.02977164271249123
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7310344827586207,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.7310344827586207,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.025634258115554955,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.025634258115554955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8548387096774194,
"acc_stderr": 0.02003956362805328,
"acc_norm": 0.8548387096774194,
"acc_norm_stderr": 0.02003956362805328
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656208,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656208
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02325315795194208,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02325315795194208
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223144,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223144
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.02136202772522272,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.02136202772522272
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7857142857142857,
"acc_stderr": 0.026653531596715484,
"acc_norm": 0.7857142857142857,
"acc_norm_stderr": 0.026653531596715484
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8825688073394495,
"acc_stderr": 0.01380278022737734,
"acc_norm": 0.8825688073394495,
"acc_norm_stderr": 0.01380278022737734
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.021328337570804365,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.021328337570804365
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.02779017706438359,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.02779017706438359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.03088466108951539,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.03088466108951539
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547129,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547129
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.032472243899179465,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.032472243899179465
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.029634717272371047,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.029634717272371047
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924978,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924978
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8850574712643678,
"acc_stderr": 0.01140572072459397,
"acc_norm": 0.8850574712643678,
"acc_norm_stderr": 0.01140572072459397
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7774566473988439,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.7774566473988439,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.02383930331139819,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.02383930331139819
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262185,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262185
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5425531914893617,
"acc_stderr": 0.029719281272236844,
"acc_norm": 0.5425531914893617,
"acc_norm_stderr": 0.029719281272236844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5338983050847458,
"acc_stderr": 0.012740853872949839,
"acc_norm": 0.5338983050847458,
"acc_norm_stderr": 0.012740853872949839
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7132352941176471,
"acc_stderr": 0.027472274473233818,
"acc_norm": 0.7132352941176471,
"acc_norm_stderr": 0.027472274473233818
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.01755581809132227,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.01755581809132227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.02783302387139968,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.02783302387139968
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900826,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900826
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061445,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061445
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.5644074616941972,
"mc2_stderr": 0.015397066221595713
},
"harness|winogrande|5": {
"acc": 0.7987371744277821,
"acc_stderr": 0.011268519971577684
},
"harness|gsm8k|5": {
"acc": 0.604245640636846,
"acc_stderr": 0.013469823701048815
}
}
```
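Every per-task block in the JSON above shares the same shape: keys of the form `harness|<task>|<n_shots>` mapping to a dict of metrics, so once the results file is loaded (e.g. with `json.load`), the task-level accuracies can be pulled out with a simple dict comprehension. A minimal sketch on a trimmed sample of the dict (the values below are copied from the results above; the full dict would just have more entries):

```python
# A trimmed sample of the results dict shown above; a real script would load
# the full JSON file instead, e.g. with json.load(open(path)).
results = {
    "all": {"acc": 0.7024107525999363},
    "harness|arc:challenge|25": {"acc": 0.6040955631399317, "acc_norm": 0.6254266211604096},
    "harness|hellaswag|10": {"acc": 0.6191993626767576, "acc_norm": 0.8253335988846843},
}

# Keep only the per-task entries (keys of the form "harness|<task>|<n_shots>"),
# skipping the "all" aggregate, and extract each task's accuracy.
task_acc = {
    name: metrics["acc"]
    for name, metrics in results.items()
    if name.startswith("harness|")
}

# Print tasks from highest to lowest accuracy.
for name, acc in sorted(task_acc.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {acc:.4f}")
```

The same pattern works for `acc_norm` or the TruthfulQA `mc1`/`mc2` metrics; only the key looked up in each metrics dict changes.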
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kgr123/quality_counter_2000 | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 11265605
num_examples: 1929
- name: train
num_bytes: 11155926
num_examples: 1935
- name: validation
num_bytes: 11367894
num_examples: 1941
download_size: 7642748
dataset_size: 33789425
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
mstz/fairbelief | ---
license: cc-by-sa-4.0
---
|
distilled-from-one-sec-cv12/chunk_17 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1420681296
num_examples: 276828
download_size: 1451360616
dataset_size: 1420681296
---
# Dataset Card for "chunk_17"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TIGER-Lab/Subject_Driven_Image_Editing | ---
dataset_info:
features:
- name: uid
dtype: int64
- name: image
dtype: image
- name: subject
dtype: string
- name: subject_image_0
dtype: image
- name: subject_image_1
dtype: image
- name: subject_image_2
dtype: image
splits:
- name: eval
num_bytes: 154799894.0
num_examples: 154
- name: extra
num_bytes: 66230300.0
num_examples: 66
download_size: 49158277
dataset_size: 221030194.0
---
# Dataset Card for "Subject_Driven_Image_Editing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Doutran/myllinhaset | ---
license: openrail
---
|
open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B | ---
pretty_name: Evaluation run of BarryFutureman/ChatMarc-YesAnotherMerge-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BarryFutureman/ChatMarc-YesAnotherMerge-7B](https://huggingface.co/BarryFutureman/ChatMarc-YesAnotherMerge-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-24T05:34:18.479696](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B/blob/main/results_2024-01-24T05-34-18.479696.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6560146863596611,\n\
\ \"acc_stderr\": 0.03206476346441959,\n \"acc_norm\": 0.6553348227714214,\n\
\ \"acc_norm_stderr\": 0.03273507303552595,\n \"mc1\": 0.5618115055079559,\n\
\ \"mc1_stderr\": 0.017369236164404406,\n \"mc2\": 0.700398174715358,\n\
\ \"mc2_stderr\": 0.015160702701664436\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244484,\n\
\ \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423702\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7228639713204541,\n\
\ \"acc_stderr\": 0.004466695023677836,\n \"acc_norm\": 0.8838876717785302,\n\
\ \"acc_norm_stderr\": 0.003197048476003638\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \
\ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.016574027219517635,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.016574027219517635\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5618115055079559,\n\
\ \"mc1_stderr\": 0.017369236164404406,\n \"mc2\": 0.700398174715358,\n\
\ \"mc2_stderr\": 0.015160702701664436\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785722\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6997725549658832,\n \
\ \"acc_stderr\": 0.012625423152283034\n }\n}\n```"
repo_url: https://huggingface.co/BarryFutureman/ChatMarc-YesAnotherMerge-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|arc:challenge|25_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|gsm8k|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hellaswag|10_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T05-34-18.479696.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T05-34-18.479696.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- '**/details_harness|winogrande|5_2024-01-24T05-34-18.479696.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-24T05-34-18.479696.parquet'
- config_name: results
data_files:
- split: 2024_01_24T05_34_18.479696
path:
- results_2024-01-24T05-34-18.479696.parquet
- split: latest
path:
- results_2024-01-24T05-34-18.479696.parquet
---
# Dataset Card for Evaluation run of BarryFutureman/ChatMarc-YesAnotherMerge-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarryFutureman/ChatMarc-YesAnotherMerge-7B](https://huggingface.co/BarryFutureman/ChatMarc-YesAnotherMerge-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-24T05:34:18.479696](https://huggingface.co/datasets/open-llm-leaderboard/details_BarryFutureman__ChatMarc-YesAnotherMerge-7B/blob/main/results_2024-01-24T05-34-18.479696.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6560146863596611,
"acc_stderr": 0.03206476346441959,
"acc_norm": 0.6553348227714214,
"acc_norm_stderr": 0.03273507303552595,
"mc1": 0.5618115055079559,
"mc1_stderr": 0.017369236164404406,
"mc2": 0.700398174715358,
"mc2_stderr": 0.015160702701664436
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.013363080107244484,
"acc_norm": 0.7278156996587031,
"acc_norm_stderr": 0.013006600406423702
},
"harness|hellaswag|10": {
"acc": 0.7228639713204541,
"acc_stderr": 0.004466695023677836,
"acc_norm": 0.8838876717785302,
"acc_norm_stderr": 0.003197048476003638
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590172,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590172
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.016574027219517635,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.016574027219517635
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657476,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5618115055079559,
"mc1_stderr": 0.017369236164404406,
"mc2": 0.700398174715358,
"mc2_stderr": 0.015160702701664436
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.010329712832785722
},
"harness|gsm8k|5": {
"acc": 0.6997725549658832,
"acc_stderr": 0.012625423152283034
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
coastalcph/fm-updates-llama2-chat-7b | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query
struct:
- name: label
dtype: string
- name: objects
list:
- name: aliases
sequence: string
- name: label
dtype: string
- name: qid
dtype: string
- name: qid
dtype: string
- name: rel_id
dtype: string
- name: relation
dtype: string
- name: prediction
struct:
- name: predictions
list:
- name: answer
dtype: string
- name: first_token_probability
dtype: float64
- name: per_token_probability
sequence: float64
- name: perplexity
dtype: float64
- name: query
dtype: string
- name: f1
dtype: float64
- name: relation
dtype: string
- name: type
dtype: string
- name: original_answer
dtype: string
- name: updates
sequence: string
splits:
- name: test
num_bytes: 2983210.8077126252
num_examples: 6414
download_size: 1236982
dataset_size: 2983210.8077126252
---
# Dataset Card for "fm-updates-llama2-chat-7b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
intanm/indonesian-financial-topic-classification-dataset | ---
license: apache-2.0
task_categories:
- text-classification
language:
- id
tags:
- finance
size_categories:
- 10K<n<100K
---
Translated version of https://huggingface.co/datasets/zeroshot/twitter-financial-news-topic
```python
topics = {
    "LABEL_0": "Analyst Update",
    "LABEL_1": "Fed | Central Banks",
    "LABEL_2": "Company | Product News",
    "LABEL_3": "Treasuries | Corporate Debt",
    "LABEL_4": "Dividend",
    "LABEL_5": "Earnings",
    "LABEL_6": "Energy | Oil",
    "LABEL_7": "Financials",
    "LABEL_8": "Currencies",
    "LABEL_9": "General News | Opinion",
    "LABEL_10": "Gold | Metals | Materials",
    "LABEL_11": "IPO",
    "LABEL_12": "Legal | Regulation",
    "LABEL_13": "M&A | Investments",
    "LABEL_14": "Macro",
    "LABEL_15": "Markets",
    "LABEL_16": "Politics",
    "LABEL_17": "Personnel Change",
    "LABEL_18": "Stock Commentary",
    "LABEL_19": "Stock Movement",
}
```
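A classifier fine-tuned on this dataset emits the `LABEL_*` ids listed above; a minimal sketch of mapping them back to topic names (the helper below is illustrative, not part of the dataset):

```python
# Map "LABEL_<n>" ids (as emitted by a classifier fine-tuned on this
# dataset) back to their human-readable topic names. Unknown ids fall
# back to "Unknown" rather than raising. Subset of labels shown.
topics = {
    "LABEL_0": "Analyst Update",
    "LABEL_1": "Fed | Central Banks",
    "LABEL_5": "Earnings",
    "LABEL_16": "Politics",
}

def label_to_topic(label_id: str) -> str:
    """Return the topic name for a label id, or 'Unknown' if unseen."""
    return topics.get(label_id, "Unknown")

print(label_to_topic("LABEL_5"))   # Earnings
print(label_to_topic("LABEL_99"))  # Unknown
```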
|
argilla/ultrafeedback_binarized_full | ---
dataset_info:
features:
- name: source
dtype: string
- name: instruction
dtype: string
- name: best_response
struct:
- name: annotations
struct:
- name: helpfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: honesty
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: instruction_following
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: truthfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: critique
dtype: string
- name: custom_system_prompt
dtype: string
- name: model
dtype: string
- name: overall_score
dtype: float64
- name: principle
dtype: string
- name: response
dtype: string
- name: best_model
dtype: string
- name: best_score
dtype: float64
- name: random_response
struct:
- name: annotations
struct:
- name: helpfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: honesty
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: instruction_following
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: truthfulness
struct:
- name: Rating
dtype: string
- name: Rationale
dtype: string
- name: Rationale For Rating
dtype: string
- name: Type
sequence: string
- name: critique
dtype: string
- name: custom_system_prompt
dtype: string
- name: model
dtype: string
- name: overall_score
dtype: float64
- name: principle
dtype: string
- name: response
dtype: string
- name: random_model
dtype: string
- name: random_score
dtype: float64
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
splits:
- name: train
num_bytes: 447221757
num_examples: 63967
download_size: 199896433
dataset_size: 447221757
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ultrafeedback_binarized_full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yangyz1230/H3K79me3_not_filtered | ---
dataset_info:
features:
- name: name
dtype: string
- name: sequence
dtype: string
- name: chrom
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: strand
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 439373
num_examples: 799
- name: test
num_bytes: 44840
num_examples: 82
download_size: 232580
dataset_size: 484213
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
irds/beir_msmarco | ---
pretty_name: '`beir/msmarco`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `beir/msmarco`
The `beir/msmarco` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/msmarco).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=8,841,823
- `queries` (i.e., topics); count=509,962
This dataset is used by: [`beir_msmarco_dev`](https://huggingface.co/datasets/irds/beir_msmarco_dev), [`beir_msmarco_test`](https://huggingface.co/datasets/irds/beir_msmarco_test), [`beir_msmarco_train`](https://huggingface.co/datasets/irds/beir_msmarco_train)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/beir_msmarco', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
queries = load_dataset('irds/beir_msmarco', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Bajaj2016Msmarco,
title={MS MARCO: A Human Generated MAchine Reading COmprehension Dataset},
author={Payal Bajaj, Daniel Campos, Nick Craswell, Li Deng, Jianfeng Gao, Xiaodong Liu, Rangan Majumder, Andrew McNamara, Bhaskar Mitra, Tri Nguyen, Mir Rosenberg, Xia Song, Alina Stoica, Saurabh Tiwary, Tong Wang},
booktitle={InCoCo@NIPS},
year={2016}
}
@article{Thakur2021Beir,
title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
journal= "arXiv preprint arXiv:2104.08663",
month = "4",
year = "2021",
url = "https://arxiv.org/abs/2104.08663",
}
```
|
ConvLab/sgd2 | ---
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: SGD-X v2
size_categories:
- 10K<n<100K
task_categories:
- conversational
---
# Dataset Card for SGD-X v2
- **Repository:** https://github.com/google-research-datasets/dstc8-schema-guided-dialogue/tree/master/sgd_x
- **Paper:** https://arxiv.org/pdf/2110.06800.pdf
- **Leaderboard:** None
- **Who transforms the dataset:** Qi Zhu(zhuq96 at gmail dot com)
To use this dataset, you need to install [ConvLab-3](https://github.com/ConvLab/ConvLab-3) platform first. Then you can load the dataset via:
```python
from convlab.util import load_dataset, load_ontology, load_database
dataset = load_dataset('sgd2')
ontology = load_ontology('sgd2')
database = load_database('sgd2')
```
For more usage please refer to [here](https://github.com/ConvLab/ConvLab-3/tree/master/data/unified_datasets).
### Dataset Summary
The **Schema-Guided Dialogue (SGD)** dataset consists of over 20k annotated multi-domain, task-oriented conversations between a human and a virtual assistant. These conversations involve interactions with services and APIs spanning 20 domains, such as banks, events, media, calendar, travel, and weather. For most of these domains, the dataset contains multiple different APIs, many of which have overlapping functionalities but different interfaces, which reflects common real-world scenarios. The wide range of available annotations can be used for intent prediction, slot filling, dialogue state tracking, policy imitation learning, language generation, and user simulation learning, among other tasks for developing large-scale virtual assistants. Additionally, the dataset contains unseen domains and services in the evaluation set to quantify the performance in zero-shot or few-shot settings.
The **SGD-X** dataset consists of 5 linguistic variants of every schema in the original SGD dataset. Linguistic variants were written by hundreds of paid crowd-workers. In the SGD-X directory, v1 represents the variant closest to the original schemas and v5 the farthest in terms of linguistic distance. To evaluate model performance on SGD-X schemas, dialogues must be converted using the script generate_sgdx_dialogues.py.
- **How to get the transformed data from original data:**
- Download [dstc8-schema-guided-dialogue-master.zip](https://github.com/google-research-datasets/dstc8-schema-guided-dialogue/archive/refs/heads/master.zip).
  - Modify `sgd_x/generate_sgdx_dialogues.py` as described in https://github.com/google-research-datasets/dstc8-schema-guided-dialogue/issues/57.
  - Run `python -m sgd_x.generate_sgdx_dialogues` under the `dstc8-schema-guided-dialogue-master` directory, which requires TensorFlow to be installed.
- Run `python preprocess.py` in the current directory.
- **Main changes of the transformation:**
- Lower case original `act` as `intent`.
- Add `count` slot for each domain, non-categorical, find span by text matching.
- Categorize `dialogue acts` according to the `intent`.
- Concatenate multiple values using `|`.
- Retain `active_intent`, `requested_slots`, `service_call`.
- **Annotations:**
- dialogue acts, state, db_results, service_call, active_intent, requested_slots.
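Because multiple slot values are joined with `|` in this unified format (see the transformation notes above), downstream code usually splits them back into lists; a minimal sketch over a hypothetical state dict (the domain and slot names are illustrative):

```python
# Hypothetical dialogue state; in this unified format, multiple values
# for one slot are concatenated with "|".
state = {"hotels": {"star_rating": "4|5", "has_wifi": "True"}}

def split_slot_values(state):
    """Expand '|'-joined slot values into lists of candidate values."""
    return {
        domain: {slot: value.split("|") for slot, value in slots.items()}
        for domain, slots in state.items()
    }

expanded = split_slot_values(state)
# {'hotels': {'star_rating': ['4', '5'], 'has_wifi': ['True']}}
```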
### Supported Tasks and Leaderboards
NLU, DST, Policy, NLG, E2E
### Languages
English
### Data Splits
| split | dialogues | utterances | avg_utt | avg_tokens | avg_domains | cat slot match(state) | cat slot match(goal) | cat slot match(dialogue act) | non-cat slot span(dialogue act) |
|------------|-------------|--------------|-----------|--------------|---------------|-------------------------|------------------------|--------------------------------|-----------------------------------|
| train | 16142 | 329964 | 20.44 | 9.75 | 1.84 | 100 | - | 100 | 100 |
| validation | 2482 | 48726 | 19.63 | 9.66 | 1.84 | 100 | - | 100 | 100 |
| test | 4201 | 84594 | 20.14 | 10.4 | 2.02 | 100 | - | 100 | 100 |
| all | 22825 | 463284 | 20.3 | 9.86 | 1.87 | 100 | - | 100 | 100 |
45 domains: ['Banks_12', 'Buses_12', 'Buses_22', 'Calendar_12', 'Events_12', 'Events_22', 'Flights_12', 'Flights_22', 'Homes_12', 'Hotels_12', 'Hotels_22', 'Hotels_32', 'Media_12', 'Movies_12', 'Music_12', 'Music_22', 'RentalCars_12', 'RentalCars_22', 'Restaurants_12', 'RideSharing_12', 'RideSharing_22', 'Services_12', 'Services_22', 'Services_32', 'Travel_12', 'Weather_12', 'Alarm_12', 'Banks_22', 'Flights_32', 'Hotels_42', 'Media_22', 'Movies_22', 'Restaurants_22', 'Services_42', 'Buses_32', 'Events_32', 'Flights_42', 'Homes_22', 'Media_32', 'Messaging_12', 'Movies_32', 'Music_32', 'Payment_12', 'RentalCars_32', 'Trains_12']
- **cat slot match**: how many values of categorical slots are in the possible values of ontology in percentage.
- **non-cat slot span**: how many values of non-categorical slots have span annotation in percentage.
### Citation
```
@inproceedings{lee2022sgd,
title={SGD-X: A Benchmark for Robust Generalization in Schema-Guided Dialogue Systems},
author={Lee, Harrison and Gupta, Raghav and Rastogi, Abhinav and Cao, Yuan and Zhang, Bin and Wu, Yonghui},
booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
volume={36},
number={10},
pages={10938--10946},
year={2022}
}
```
### Licensing Information
[**CC BY-SA 4.0**](https://creativecommons.org/licenses/by-sa/4.0/) |
g-ronimo/oasst2_top1_en_answers-mixtral | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 14166584
num_examples: 5419
download_size: 7059605
dataset_size: 14166584
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
tags:
- synthetic
---
* Top 1% conversations of https://huggingface.co/datasets/OpenAssistant/oasst2
* language-filtered: en
* generated using https://github.com/blancsw/deep_4_all/blob/main/datasets/oasst/convert.py
* assistant answers replaced with answers by [Mixtral-8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1)
* _Note_: This is an unfiltered dataset; it certainly contains some very poor answers. |
distilled-from-one-sec-cv12/chunk_79 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1303912900
num_examples: 254075
download_size: 1332936637
dataset_size: 1303912900
---
# Dataset Card for "chunk_79"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amitness/logits-maltese-512 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: teacher_logits
sequence:
sequence: float64
- name: teacher_indices
sequence:
sequence: int64
- name: teacher_mask_indices
sequence: int64
splits:
- name: train
num_bytes: 230200052
num_examples: 12655
download_size: 84312982
dataset_size: 230200052
---
# Dataset Card for "logits-maltese-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vishwanath0912/qa_en_hi | ---
license: mit
---
|
sumitpardhiya/Face-Mask-Detection | ---
license: apache-2.0
---
The dataset contains three folders: Train, Test, and Validation. Each folder includes two subfolders: one for images with masks and one for images without masks. |
EleutherAI/quirky_modularaddition_increment0_bob_hard | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 3563112.95803125
num_examples: 48087
- name: validation
num_bytes: 75436.0905
num_examples: 1018
- name: test
num_bytes: 73418.235
num_examples: 991
download_size: 1104505
dataset_size: 3711967.28353125
---
# Dataset Card for "quirky_modularaddition_increment0_bob_hard"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kenilshah35/dictation-test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 8894736.0
num_examples: 19
download_size: 4493848
dataset_size: 8894736.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
StanBienaives/jade-considerants | ---
language:
- fr
--- |
open-llm-leaderboard/details_vistagi__Mixtral-8x7b-v0.1-dpo | ---
pretty_name: Evaluation run of vistagi/Mixtral-8x7b-v0.1-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vistagi/Mixtral-8x7b-v0.1-dpo](https://huggingface.co/vistagi/Mixtral-8x7b-v0.1-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vistagi__Mixtral-8x7b-v0.1-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T06:28:07.647994](https://huggingface.co/datasets/open-llm-leaderboard/details_vistagi__Mixtral-8x7b-v0.1-dpo/blob/main/results_2024-02-18T06-28-07.647994.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7134552339034452,\n\
\ \"acc_stderr\": 0.030055997546363594,\n \"acc_norm\": 0.7181597948300631,\n\
\ \"acc_norm_stderr\": 0.030631631253278484,\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.016255241993179185,\n \"mc2\": 0.4674384125733044,\n\
\ \"mc2_stderr\": 0.01414272854245227\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6322525597269625,\n \"acc_stderr\": 0.01409099561816849,\n\
\ \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6694881497709619,\n\
\ \"acc_stderr\": 0.004694360968929403,\n \"acc_norm\": 0.8639713204540929,\n\
\ \"acc_norm_stderr\": 0.0034211839093201673\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6962962962962963,\n\
\ \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.6962962962962963,\n\
\ \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.031103182383123387,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.031103182383123387\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n\
\ \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n\
\ \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n\
\ \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.03047297336338004,\n\
\ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.03047297336338004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6491228070175439,\n\
\ \"acc_stderr\": 0.04489539350270698,\n \"acc_norm\": 0.6491228070175439,\n\
\ \"acc_norm_stderr\": 0.04489539350270698\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47619047619047616,\n \"acc_stderr\": 0.025722097064388525,\n \"\
acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.025722097064388525\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8354838709677419,\n \"acc_stderr\": 0.021090847745939313,\n \"\
acc_norm\": 0.8354838709677419,\n \"acc_norm_stderr\": 0.021090847745939313\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\"\
: 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240524,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240524\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7128205128205128,\n \"acc_stderr\": 0.022939925418530613,\n\
\ \"acc_norm\": 0.7128205128205128,\n \"acc_norm_stderr\": 0.022939925418530613\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7899159663865546,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.7899159663865546,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248437,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248437\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"\
acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6342592592592593,\n \"acc_stderr\": 0.03284738857647206,\n \"\
acc_norm\": 0.6342592592592593,\n \"acc_norm_stderr\": 0.03284738857647206\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568624,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568624\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n \
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519517,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519517\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494732,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494732\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.03008309871603521,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.03008309871603521\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.033932957297610096,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.033932957297610096\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.018315891685625852,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.018315891685625852\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8735632183908046,\n\
\ \"acc_stderr\": 0.011884488905895555,\n \"acc_norm\": 0.8735632183908046,\n\
\ \"acc_norm_stderr\": 0.011884488905895555\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8034682080924855,\n \"acc_stderr\": 0.021393961404363844,\n\
\ \"acc_norm\": 0.8034682080924855,\n \"acc_norm_stderr\": 0.021393961404363844\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n\
\ \"acc_stderr\": 0.016421670506339175,\n \"acc_norm\": 0.40558659217877097,\n\
\ \"acc_norm_stderr\": 0.016421670506339175\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.02198603218206415,\n\
\ \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.02198603218206415\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8487654320987654,\n \"acc_stderr\": 0.019935086092149883,\n\
\ \"acc_norm\": 0.8487654320987654,\n \"acc_norm_stderr\": 0.019935086092149883\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5283687943262412,\n \"acc_stderr\": 0.02977945095730305,\n \
\ \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.02977945095730305\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5325945241199479,\n\
\ \"acc_stderr\": 0.012743072942653368,\n \"acc_norm\": 0.5325945241199479,\n\
\ \"acc_norm_stderr\": 0.012743072942653368\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8014705882352942,\n \"acc_stderr\": 0.024231013370541087,\n\
\ \"acc_norm\": 0.8014705882352942,\n \"acc_norm_stderr\": 0.024231013370541087\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7826797385620915,\n \"acc_stderr\": 0.016684820929148587,\n \
\ \"acc_norm\": 0.7826797385620915,\n \"acc_norm_stderr\": 0.016684820929148587\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7918367346938775,\n \"acc_stderr\": 0.025991117672813292,\n\
\ \"acc_norm\": 0.7918367346938775,\n \"acc_norm_stderr\": 0.025991117672813292\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.016255241993179185,\n \"mc2\": 0.4674384125733044,\n\
\ \"mc2_stderr\": 0.01414272854245227\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.010905978112156885\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5617892342683851,\n \
\ \"acc_stderr\": 0.013666915917255069\n }\n}\n```"
repo_url: https://huggingface.co/vistagi/Mixtral-8x7b-v0.1-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|arc:challenge|25_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|gsm8k|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hellaswag|10_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T06-28-07.647994.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T06-28-07.647994.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- '**/details_harness|winogrande|5_2024-02-18T06-28-07.647994.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T06-28-07.647994.parquet'
- config_name: results
data_files:
- split: 2024_02_18T06_28_07.647994
path:
- results_2024-02-18T06-28-07.647994.parquet
- split: latest
path:
- results_2024-02-18T06-28-07.647994.parquet
---
# Dataset Card for Evaluation run of vistagi/Mixtral-8x7b-v0.1-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vistagi/Mixtral-8x7b-v0.1-dpo](https://huggingface.co/vistagi/Mixtral-8x7b-v0.1-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vistagi__Mixtral-8x7b-v0.1-dpo",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-18T06:28:07.647994](https://huggingface.co/datasets/open-llm-leaderboard/details_vistagi__Mixtral-8x7b-v0.1-dpo/blob/main/results_2024-02-18T06-28-07.647994.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7134552339034452,
"acc_stderr": 0.030055997546363594,
"acc_norm": 0.7181597948300631,
"acc_norm_stderr": 0.030631631253278484,
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179185,
"mc2": 0.4674384125733044,
"mc2_stderr": 0.01414272854245227
},
"harness|arc:challenge|25": {
"acc": 0.6322525597269625,
"acc_stderr": 0.01409099561816849,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441374
},
"harness|hellaswag|10": {
"acc": 0.6694881497709619,
"acc_stderr": 0.004694360968929403,
"acc_norm": 0.8639713204540929,
"acc_norm_stderr": 0.0034211839093201673
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6962962962962963,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.6962962962962963,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.031103182383123387,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.031103182383123387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.03047297336338004,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.03047297336338004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.04489539350270698,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.04489539350270698
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.025722097064388525,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.025722097064388525
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8354838709677419,
"acc_stderr": 0.021090847745939313,
"acc_norm": 0.8354838709677419,
"acc_norm_stderr": 0.021090847745939313
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240524,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7128205128205128,
"acc_stderr": 0.022939925418530613,
"acc_norm": 0.7128205128205128,
"acc_norm_stderr": 0.022939925418530613
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7899159663865546,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.7899159663865546,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.01370874953417264,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.01370874953417264
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6342592592592593,
"acc_stderr": 0.03284738857647206,
"acc_norm": 0.6342592592592593,
"acc_norm_stderr": 0.03284738857647206
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568624,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884562,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884562
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519517,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494732,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494732
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.03008309871603521,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.03008309871603521
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.033932957297610096,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.033932957297610096
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625852,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625852
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8735632183908046,
"acc_stderr": 0.011884488905895555,
"acc_norm": 0.8735632183908046,
"acc_norm_stderr": 0.011884488905895555
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8034682080924855,
"acc_stderr": 0.021393961404363844,
"acc_norm": 0.8034682080924855,
"acc_norm_stderr": 0.021393961404363844
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.016421670506339175,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.016421670506339175
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.02198603218206415,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.02198603218206415
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8487654320987654,
"acc_stderr": 0.019935086092149883,
"acc_norm": 0.8487654320987654,
"acc_norm_stderr": 0.019935086092149883
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.02977945095730305,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.02977945095730305
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5325945241199479,
"acc_stderr": 0.012743072942653368,
"acc_norm": 0.5325945241199479,
"acc_norm_stderr": 0.012743072942653368
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8014705882352942,
"acc_stderr": 0.024231013370541087,
"acc_norm": 0.8014705882352942,
"acc_norm_stderr": 0.024231013370541087
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7826797385620915,
"acc_stderr": 0.016684820929148587,
"acc_norm": 0.7826797385620915,
"acc_norm_stderr": 0.016684820929148587
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7918367346938775,
"acc_stderr": 0.025991117672813292,
"acc_norm": 0.7918367346938775,
"acc_norm_stderr": 0.025991117672813292
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015575,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015575
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179185,
"mc2": 0.4674384125733044,
"mc2_stderr": 0.01414272854245227
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.010905978112156885
},
"harness|gsm8k|5": {
"acc": 0.5617892342683851,
"acc_stderr": 0.013666915917255069
}
}
```
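The per-task results above are a flat dict keyed by harness task name, so they can be aggregated with plain Python. A minimal sketch (the `mean_mmlu_acc` helper and the toy `sample` dict are illustrative, not part of the card):

```python
def mean_mmlu_acc(results: dict) -> float:
    """Average the `acc` metric over all hendrycksTest (MMLU) subtasks."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Toy example with two MMLU subtasks and one non-MMLU task:
sample = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5},
    "harness|hendrycksTest-virology|5": {"acc": 0.75},
    "harness|winogrande|5": {"acc": 0.8},  # ignored: not an MMLU subtask
}
print(mean_mmlu_acc(sample))  # -> 0.625
```

The same pattern works on the full results JSON linked above once it is loaded with `json.load`.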
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mtkinit/SuperDataset18293 | ---
pretty_name: SuperDataset18293
tags:
- uci
- world
---
# SuperDataset18293
Created from AIOD platform |
kalcho100/flippy_combined_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 925982000.8
num_examples: 677240
- name: test
num_bytes: 231495500.2
num_examples: 169310
download_size: 623484715
dataset_size: 1157477501.0
---
# Dataset Card for "flippy_combined_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-13B | ---
pretty_name: Evaluation run of TurkuNLP/gpt3-finnish-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TurkuNLP/gpt3-finnish-13B](https://huggingface.co/TurkuNLP/gpt3-finnish-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-13B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-08T08:19:11.789658](https://huggingface.co/datasets/open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-13B_public/blob/main/results_2023-11-08T08-19-11.789658.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.010906040268456376,\n\
\ \"em_stderr\": 0.0010636334198497977,\n \"f1\": 0.0586136744966444,\n\
\ \"f1_stderr\": 0.001583703669300269,\n \"acc\": 0.29157154884622954,\n\
\ \"acc_stderr\": 0.007692758773767466\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.010906040268456376,\n \"em_stderr\": 0.0010636334198497977,\n\
\ \"f1\": 0.0586136744966444,\n \"f1_stderr\": 0.001583703669300269\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \
\ \"acc_stderr\": 0.0015145735612245414\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.580110497237569,\n \"acc_stderr\": 0.013870943986310391\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TurkuNLP/gpt3-finnish-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_08T08_19_11.789658
path:
- '**/details_harness|drop|3_2023-11-08T08-19-11.789658.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-08T08-19-11.789658.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_08T08_19_11.789658
path:
- '**/details_harness|gsm8k|5_2023-11-08T08-19-11.789658.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-08T08-19-11.789658.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_08T08_19_11.789658
path:
- '**/details_harness|winogrande|5_2023-11-08T08-19-11.789658.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-08T08-19-11.789658.parquet'
- config_name: results
data_files:
- split: 2023_11_08T08_19_11.789658
path:
- results_2023-11-08T08-19-11.789658.parquet
- split: latest
path:
- results_2023-11-08T08-19-11.789658.parquet
---
# Dataset Card for Evaluation run of TurkuNLP/gpt3-finnish-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TurkuNLP/gpt3-finnish-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TurkuNLP/gpt3-finnish-13B](https://huggingface.co/TurkuNLP/gpt3-finnish-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-13B_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-08T08:19:11.789658](https://huggingface.co/datasets/open-llm-leaderboard/details_TurkuNLP__gpt3-finnish-13B_public/blob/main/results_2023-11-08T08-19-11.789658.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.010906040268456376,
"em_stderr": 0.0010636334198497977,
"f1": 0.0586136744966444,
"f1_stderr": 0.001583703669300269,
"acc": 0.29157154884622954,
"acc_stderr": 0.007692758773767466
},
"harness|drop|3": {
"em": 0.010906040268456376,
"em_stderr": 0.0010636334198497977,
"f1": 0.0586136744966444,
"f1_stderr": 0.001583703669300269
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245414
},
"harness|winogrande|5": {
"acc": 0.580110497237569,
"acc_stderr": 0.013870943986310391
}
}
```
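Each configuration above exposes one split per run, named after the run timestamp (e.g. `2023_11_08T08_19_11.789658`), plus a `latest` alias. If you have several timestamped splits and want the most recent one explicitly, the format sorts lexicographically in chronological order; a minimal sketch (the `latest_split` helper is illustrative, not part of the card):

```python
def latest_split(split_names: list[str]) -> str:
    """Return the most recent timestamped split name.

    Split names look like '2023_11_08T08_19_11.789658'; the literal
    'latest' alias is skipped. Zero-padded fields make lexicographic
    order equal to chronological order.
    """
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped)

splits = ["2023_10_01T12_00_00.000000", "2023_11_08T08_19_11.789658", "latest"]
print(latest_split(splits))  # -> 2023_11_08T08_19_11.789658
```

In practice the `latest` split already points at this run, so the helper is only needed when comparing successive evaluation runs.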
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mboth/waermeVerteilen-200-undersampled | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ZweiteGrundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Druckhaltestation
'1': HeizkreisAllgemein
'2': Heizkurve
'3': Kaeltemengenzaehler
'4': Pumpe
'5': Raum
'6': Regler
'7': Ruecklauf
'8': Uebertrager
'9': Ventil
'10': Vorlauf
'11': Waermemengenzaehler
'12': Warmwasserbereitung
splits:
- name: train
num_bytes: 407710.65048052603
num_examples: 1916
- name: test
num_bytes: 423002
num_examples: 1978
- name: valid
num_bytes: 423002
num_examples: 1978
download_size: 411048
dataset_size: 1253714.650480526
---
# Dataset Card for "waermeVerteilen-200-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rojpav/mini-croupier | ---
license: apache-2.0
---
|
communityai/aptchat-code-math-0.5k | ---
dataset_info:
features:
- name: category
dtype: string
- name: total_tokens
dtype: int64
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 7006904.0
num_examples: 581
download_size: 2903792
dataset_size: 7006904.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vwxyzjn/ultrafeedback_binarized_1710204240 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: query
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_token
sequence: int64
- name: query_token_len
dtype: int64
- name: query_chosen_token
sequence: int64
- name: query_chosen_token_len
dtype: int64
- name: chosen_token
sequence: int64
- name: chosen_token_len
dtype: int64
- name: query_rejected_token
sequence: int64
- name: query_rejected_token_len
dtype: int64
- name: rejected_token
sequence: int64
- name: rejected_token_len
dtype: int64
splits:
- name: train_prefs
num_bytes: 975839605.6968021
num_examples: 24122
- name: test_prefs
num_bytes: 31713753.975
num_examples: 786
download_size: 113334298
dataset_size: 1007553359.6718022
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
---
|
SEACrowd/kopi_nllb | ---
tags:
- self-supervised-pretraining
language:
- ind
- jav
- ace
- ban
- bjn
- min
- sun
---
# kopi_nllb
KoPI (Korpus Perayapan Indonesia)-NLLB contains only the Indonesian family languages (Acehnese, Balinese, Banjar, Indonesian, Javanese, Minangkabau, Sundanese) extracted from the NLLB dataset, allenai/nllb.
Each language set was also deduplicated using an exact-hash (MD5) technique and MinHash LSH near-deduplication.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
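As a minimal sketch of the usage above (the per-language config names, such as `jav_Latn`, are an assumption; check the dataset viewer on the Hub for the actual naming scheme):

```python
def kopi_nllb_config(lang_code: str) -> str:
    """Map an ISO 639-3 code to the assumed KoPI-NLLB config name.

    The "<code>_Latn" scheme is hypothetical -- verify it against the
    dataset page before relying on it.
    """
    supported = {"ind", "jav", "ace", "ban", "bjn", "min", "sun"}
    if lang_code not in supported:
        raise ValueError(f"unsupported language code: {lang_code!r}")
    return f"{lang_code}_Latn"


def load_kopi_subset(lang_code: str, split: str = "train"):
    """Download one language subset.

    Requires network access and `pip install datasets nusacrowd`.
    The import is done lazily so the helper above stays testable offline.
    """
    from datasets import load_dataset

    return load_dataset("SEACrowd/kopi_nllb", kopi_nllb_config(lang_code), split=split)
```

For example, `ds = load_kopi_subset("jav")` would fetch the Javanese training split.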
## Citation
```
Heffernan et al., Bitext Mining Using Distilled Sentence Representations for Low-Resource Languages. arXiv, https://arxiv.org/abs/2205.12654, 2022.
NLLB Team et al., No Language Left Behind: Scaling Human-Centered Machine Translation. arXiv, https://arxiv.org/abs/2207.04672, 2022.
```
## License
ODC_C
## Homepage
[https://huggingface.co/datasets/munggok/KoPI-NLLB](https://huggingface.co/datasets/munggok/KoPI-NLLB)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
coref-data/gen_winograd_raw | ---
license: cc-by-nd-4.0
---
# gen_winograd
- Project: https://ufal.mff.cuni.cz/corefud
- Data source: https://github.com/mbzuai-nlp/gen-X/tree/bf1c0adb4b4def03cdf419c18b2948695bc1fab8
## Details
English Winograd generated by GPT-4
## Citation
```
@misc{whitehouse2023llmpowered,
title={LLM-powered Data Augmentation for Enhanced Crosslingual Performance},
author={Chenxi Whitehouse and Monojit Choudhury and Alham Fikri Aji},
year={2023},
eprint={2305.14288},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
open-llm-leaderboard/details_TheBloke__Llama-2-13B-GPTQ | ---
pretty_name: Evaluation run of TheBloke/Llama-2-13B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Llama-2-13B-GPTQ](https://huggingface.co/TheBloke/Llama-2-13B-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Llama-2-13B-GPTQ\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T16:26:14.370378](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-13B-GPTQ/blob/main/results_2023-10-27T16-26-14.370378.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n\
\ \"em_stderr\": 0.0004685065030368251,\n \"f1\": 0.06011535234899329,\n\
\ \"f1_stderr\": 0.0013639179977941345,\n \"acc\": 0.43730302009426913,\n\
\ \"acc_stderr\": 0.010347143848267699\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.0004685065030368251,\n\
\ \"f1\": 0.06011535234899329,\n \"f1_stderr\": 0.0013639179977941345\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \
\ \"acc_stderr\": 0.00871933902883308\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702316\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Llama-2-13B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|arc:challenge|25_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|arc:challenge|25_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|arc:challenge|25_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T16_26_14.370378
path:
- '**/details_harness|drop|3_2023-10-27T16-26-14.370378.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T16-26-14.370378.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T16_26_14.370378
path:
- '**/details_harness|gsm8k|5_2023-10-27T16-26-14.370378.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T16-26-14.370378.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hellaswag|10_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hellaswag|10_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hellaswag|10_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:04:20.709230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T10:42:39.395336.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T11:12:42.998068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T15:04:20.709230.parquet'
- split: 2023_08_30T10_42_39.395336
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T10:42:39.395336.parquet'
- split: 2023_08_31T11_12_42.998068
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T11:12:42.998068.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T11:12:42.998068.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T16_26_14.370378
path:
- '**/details_harness|winogrande|5_2023-10-27T16-26-14.370378.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T16-26-14.370378.parquet'
- config_name: results
data_files:
- split: 2023_08_29T15_04_20.709230
path:
- results_2023-08-29T15:04:20.709230.parquet
- split: 2023_08_30T10_42_39.395336
path:
- results_2023-08-30T10:42:39.395336.parquet
- split: 2023_08_31T11_12_42.998068
path:
- results_2023-08-31T11:12:42.998068.parquet
- split: 2023_10_27T16_26_14.370378
path:
- results_2023-10-27T16-26-14.370378.parquet
- split: latest
path:
- results_2023-10-27T16-26-14.370378.parquet
---
# Dataset Card for Evaluation run of TheBloke/Llama-2-13B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Llama-2-13B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Llama-2-13B-GPTQ](https://huggingface.co/TheBloke/Llama-2-13B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-13B-GPTQ",
"harness_winogrande_5",
split="train")
```
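The config names above are derived mechanically from the harness task identifiers that appear in the parquet file names (`harness|truthfulqa:mc|0` becomes `harness_truthfulqa_mc_0`, `harness|hendrycksTest-world_religions|5` becomes `harness_hendrycksTest_world_religions_5`). A minimal sketch of that mapping, assuming the naming convention shown in this card (the helper name `task_to_config_name` is illustrative, not part of any library):

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task identifier, as it appears in the parquet file
    names (e.g. 'harness|truthfulqa:mc|0'), to the dataset config name
    expected by load_dataset (e.g. 'harness_truthfulqa_mc_0').

    Assumes the convention visible in this card: '|', ':' and '-' are
    each replaced by '_'.
    """
    for ch in "|:-":
        task = task.replace(ch, "_")
    return task
```

For example, `task_to_config_name("harness|winogrande|5")` yields `"harness_winogrande_5"`, the config name used in the snippet above.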
## Latest results
These are the [latest results from run 2023-10-27T16:26:14.370378](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-13B-GPTQ/blob/main/results_2023-10-27T16-26-14.370378.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0020973154362416107,
"em_stderr": 0.0004685065030368251,
"f1": 0.06011535234899329,
"f1_stderr": 0.0013639179977941345,
"acc": 0.43730302009426913,
"acc_stderr": 0.010347143848267699
},
"harness|drop|3": {
"em": 0.0020973154362416107,
"em_stderr": 0.0004685065030368251,
"f1": 0.06011535234899329,
"f1_stderr": 0.0013639179977941345
},
"harness|gsm8k|5": {
"acc": 0.11296436694465505,
"acc_stderr": 0.00871933902883308
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702316
}
}
```
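Run splits are named with zero-padded run timestamps (e.g. `2023_10_27T16_26_14.370378`), so lexicographic order matches chronological order and the most recent run can be found without parsing dates. A small sketch, assuming that timestamp format (the helper name `latest_split` is illustrative):

```python
def latest_split(split_names):
    """Return the most recent timestamped split name from a config's
    split list.

    The zero-padded timestamp format (e.g. '2023_10_27T16_26_14.370378')
    sorts lexicographically in chronological order; the 'latest' alias
    is skipped so only the raw run splits are compared.
    """
    runs = [s for s in split_names if s != "latest"]
    return max(runs)
```

This is essentially what the `latest` alias in the config listings above resolves to: the split whose timestamp sorts last.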
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
fanwei1/test | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
sequence:
sequence:
sequence:
sequence: float32
- name: h264
sequence:
sequence: uint8
- name: id
sequence: string
- name: size
sequence: int64
splits:
- name: train
num_bytes: 296990
num_examples: 1
- name: validation
num_bytes: 299714
num_examples: 1
download_size: 435230
dataset_size: 596704
---
|
open-llm-leaderboard/details_togethercomputer__GPT-JT-6B-v0 | ---
pretty_name: Evaluation run of togethercomputer/GPT-JT-6B-v0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [togethercomputer/GPT-JT-6B-v0](https://huggingface.co/togethercomputer/GPT-JT-6B-v0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_togethercomputer__GPT-JT-6B-v0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T19:26:54.220051](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__GPT-JT-6B-v0/blob/main/results_2023-10-17T19-26-54.220051.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.0003314581465219154,\n \"f1\": 0.043061031879194765,\n\
\ \"f1_stderr\": 0.0011437900819203201,\n \"acc\": 0.330058886781919,\n\
\ \"acc_stderr\": 0.008219084533910332\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219154,\n\
\ \"f1\": 0.043061031879194765,\n \"f1_stderr\": 0.0011437900819203201\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.012130401819560273,\n \
\ \"acc_stderr\": 0.003015294242890946\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6479873717442778,\n \"acc_stderr\": 0.013422874824929718\n\
\ }\n}\n```"
repo_url: https://huggingface.co/togethercomputer/GPT-JT-6B-v0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T19_26_54.220051
path:
- '**/details_harness|drop|3_2023-10-17T19-26-54.220051.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T19-26-54.220051.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T19_26_54.220051
path:
- '**/details_harness|gsm8k|5_2023-10-17T19-26-54.220051.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T19-26-54.220051.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:42:14.994932.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:42:14.994932.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:42:14.994932.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T19_26_54.220051
path:
- '**/details_harness|winogrande|5_2023-10-17T19-26-54.220051.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T19-26-54.220051.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_42_14.994932
path:
- results_2023-07-19T15:42:14.994932.parquet
- split: 2023_10_17T19_26_54.220051
path:
- results_2023-10-17T19-26-54.220051.parquet
- split: latest
path:
- results_2023-10-17T19-26-54.220051.parquet
---
# Dataset Card for Evaluation run of togethercomputer/GPT-JT-6B-v0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/togethercomputer/GPT-JT-6B-v0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [togethercomputer/GPT-JT-6B-v0](https://huggingface.co/togethercomputer/GPT-JT-6B-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_togethercomputer__GPT-JT-6B-v0",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-17T19:26:54.220051](https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__GPT-JT-6B-v0/blob/main/results_2023-10-17T19-26-54.220051.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219154,
"f1": 0.043061031879194765,
"f1_stderr": 0.0011437900819203201,
"acc": 0.330058886781919,
"acc_stderr": 0.008219084533910332
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219154,
"f1": 0.043061031879194765,
"f1_stderr": 0.0011437900819203201
},
"harness|gsm8k|5": {
"acc": 0.012130401819560273,
"acc_stderr": 0.003015294242890946
},
"harness|winogrande|5": {
"acc": 0.6479873717442778,
"acc_stderr": 0.013422874824929718
}
}
```
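As a minimal sketch (not part of the original card), the nested results dictionary above can be flattened into a per-task accuracy mapping; the dictionary below copies a subset of the values from the latest run shown above:

```python
# Sketch: flatten the aggregated results shown above into a per-task
# accuracy mapping. Values are copied from the latest-run JSON.
results = {
    "all": {
        "em": 0.0010486577181208054,
        "f1": 0.043061031879194765,
        "acc": 0.330058886781919,
    },
    "harness|gsm8k|5": {"acc": 0.012130401819560273, "acc_stderr": 0.003015294242890946},
    "harness|winogrande|5": {"acc": 0.6479873717442778, "acc_stderr": 0.013422874824929718},
}

# Keep only the entries that report an accuracy metric.
accuracies = {task: metrics["acc"] for task, metrics in results.items() if "acc" in metrics}
print(sorted(accuracies))
```

The same pattern works on the full per-task dictionary loaded from the "results" configuration.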
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/metatree_fri_c0_1000_25 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 157080
num_examples: 714
- name: validation
num_bytes: 62920
num_examples: 286
download_size: 254313
dataset_size: 220000
---
# Dataset Card for "metatree_fri_c0_1000_25"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aliciiavs/chord_images | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': A
'1': A7
'2': Am
'3': C
'4': D
'5': D7
'6': E
'7': Em
'8': G
splits:
- name: train
num_bytes: 210864168.8
num_examples: 1800
download_size: 205050921
dataset_size: 210864168.8
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BubbleJoe/multi_nli_unified_input | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation_matched
path: data/validation_matched-*
- split: validation_mismatched
path: data/validation_mismatched-*
dataset_info:
features:
- name: promptID
dtype: int32
- name: pairID
dtype: string
- name: premise
dtype: string
- name: premise_binary_parse
dtype: string
- name: premise_parse
dtype: string
- name: hypothesis
dtype: string
- name: hypothesis_binary_parse
dtype: string
- name: hypothesis_parse
dtype: string
- name: genre
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: input
dtype: string
splits:
- name: train
num_bytes: 487186164
num_examples: 392702
- name: validation_matched
num_bytes: 11956580
num_examples: 9815
- name: validation_mismatched
num_bytes: 12618412
num_examples: 9832
download_size: 272284496
dataset_size: 511761156
---
# Dataset Card for "multi_nli_unified_input"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jxie/natural_questions | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 6059360
num_examples: 87925
- name: test
num_bytes: 253307
num_examples: 3610
download_size: 0
dataset_size: 6312667
---
# Dataset Card for "natural_questions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aaditya/orca_dpo_pairs-Hindi | ---
dataset_info:
features:
- name: id
dtype: string
- name: codemix_system
dtype: string
- name: codemix_question
dtype: string
- name: codemix_chosen
dtype: string
- name: codemix_rejected
dtype: string
- name: codemix_question_type
dtype: string
- name: en_system
dtype: string
- name: en_question
dtype: string
- name: en_chosen
dtype: string
- name: en_rejected
dtype: string
splits:
- name: train
num_bytes: 51127339
num_examples: 10305
download_size: 27467174
dataset_size: 51127339
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Summary
`aaditya/orca_dpo_pairs-Hindi` is an open-source Hindi-language version of the Intel/orca_dpo_pairs dataset.
This dataset can be used for any purpose, whether academic or commercial, under the terms of the
[Creative Commons Attribution-ShareAlike 3.0 Unported License](https://creativecommons.org/licenses/by-sa/3.0/legalcode).
Supported Tasks:
- Training LLMs
- Synthetic Data Generation
- Data Augmentation
Languages: Hindi
Version: 1.0
# Citation
```
@misc{orca_dpo_hindi,
author = { Pal, Ankit },
title = { orca_dpo_pairs-Hindi},
year = 2024,
url = { https://huggingface.co/datasets/aaditya/orca_dpo_pairs-Hindi },
doi = { 10.57967/hf/1759 },
publisher = { Hugging Face }
}
``` |
Rasu23/iapp_all_train_test_iter0 | ---
dataset_info:
features:
- name: question_id
dtype: string
- name: article_id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
dtype: string
- name: 'Unnamed: 0'
dtype: int64
- name: id
dtype: string
- name: references
dtype: string
- name: predictions
dtype: string
splits:
- name: train
num_bytes: 17356533
num_examples: 5761
- name: test
num_bytes: 2199490
num_examples: 739
download_size: 3155434
dataset_size: 19556023
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
micsell/hebrew_kan_sentence100000 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: id
dtype: string
- name: language
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1894199082.0
num_examples: 10000
download_size: 1893355737
dataset_size: 1894199082.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_PracticeLLM__SOLAR-tail-10.7B-Merge-v1.0 | ---
pretty_name: Evaluation run of PracticeLLM/SOLAR-tail-10.7B-Merge-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PracticeLLM/SOLAR-tail-10.7B-Merge-v1.0](https://huggingface.co/PracticeLLM/SOLAR-tail-10.7B-Merge-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PracticeLLM__SOLAR-tail-10.7B-Merge-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T11:11:59.721182](https://huggingface.co/datasets/open-llm-leaderboard/details_PracticeLLM__SOLAR-tail-10.7B-Merge-v1.0/blob/main/results_2023-12-29T11-11-59.721182.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6676932083825828,\n\
\ \"acc_stderr\": 0.03141884754120868,\n \"acc_norm\": 0.6685033288172079,\n\
\ \"acc_norm_stderr\": 0.03206517056378548,\n \"mc1\": 0.45165238678090575,\n\
\ \"mc1_stderr\": 0.01742148030027764,\n \"mc2\": 0.6056804036036146,\n\
\ \"mc2_stderr\": 0.015579014786964863\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042196,\n\
\ \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6793467436765585,\n\
\ \"acc_stderr\": 0.004657738398900938,\n \"acc_norm\": 0.8653654650468035,\n\
\ \"acc_norm_stderr\": 0.0034063520713417243\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343604,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.034961014811911786,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.034961014811911786\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236785,\n\
\ \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236785\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"\
acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346758\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n\
\ \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n\
\ \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562094,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562094\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887027,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887027\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.023094329582595698,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.023094329582595698\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7309417040358744,\n\
\ \"acc_stderr\": 0.02976377940687497,\n \"acc_norm\": 0.7309417040358744,\n\
\ \"acc_norm_stderr\": 0.02976377940687497\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026622,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026622\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834832,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834832\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n\
\ \"acc_stderr\": 0.01633726869427011,\n \"acc_norm\": 0.39329608938547483,\n\
\ \"acc_norm_stderr\": 0.01633726869427011\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.0239291555173513,\n\
\ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.0239291555173513\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262192,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262192\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n\
\ \"acc_stderr\": 0.012768922739553308,\n \"acc_norm\": 0.49282920469361147,\n\
\ \"acc_norm_stderr\": 0.012768922739553308\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.025767252010855956,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.025767252010855956\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.704248366013072,\n \"acc_stderr\": 0.01846315413263281,\n \
\ \"acc_norm\": 0.704248366013072,\n \"acc_norm_stderr\": 0.01846315413263281\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.02653704531214529,\n\
\ \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.02653704531214529\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45165238678090575,\n\
\ \"mc1_stderr\": 0.01742148030027764,\n \"mc2\": 0.6056804036036146,\n\
\ \"mc2_stderr\": 0.015579014786964863\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065597\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6557998483699773,\n \
\ \"acc_stderr\": 0.013086800426693785\n }\n}\n```"
repo_url: https://huggingface.co/PracticeLLM/SOLAR-tail-10.7B-Merge-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|arc:challenge|25_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|gsm8k|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hellaswag|10_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T11-11-59.721182.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T11-11-59.721182.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- '**/details_harness|winogrande|5_2023-12-29T11-11-59.721182.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T11-11-59.721182.parquet'
- config_name: results
data_files:
- split: 2023_12_29T11_11_59.721182
path:
- results_2023-12-29T11-11-59.721182.parquet
- split: latest
path:
- results_2023-12-29T11-11-59.721182.parquet
---
# Dataset Card for Evaluation run of PracticeLLM/SOLAR-tail-10.7B-Merge-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [PracticeLLM/SOLAR-tail-10.7B-Merge-v1.0](https://huggingface.co/PracticeLLM/SOLAR-tail-10.7B-Merge-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PracticeLLM__SOLAR-tail-10.7B-Merge-v1.0",
"harness_winogrande_5",
	split="latest")
```
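The per-run splits are named after the run's timestamp, with `_` substituted for the `-` and `:` characters that are not allowed in split names. As a small illustrative sketch (the helper name `split_to_datetime` is not part of any library), such a split name can be converted back into a `datetime`:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names look like "2023_12_29T11_11_59.721182":
    # the date part uses "_" in place of "-", and the time part
    # uses "_" in place of ":".
    date_part, time_part = split_name.split("T")
    iso = f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}"
    return datetime.fromisoformat(iso)

print(split_to_datetime("2023_12_29T11_11_59.721182"))
# → 2023-12-29 11:11:59.721182
```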
## Latest results
These are the [latest results from run 2023-12-29T11:11:59.721182](https://huggingface.co/datasets/open-llm-leaderboard/details_PracticeLLM__SOLAR-tail-10.7B-Merge-v1.0/blob/main/results_2023-12-29T11-11-59.721182.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6676932083825828,
"acc_stderr": 0.03141884754120868,
"acc_norm": 0.6685033288172079,
"acc_norm_stderr": 0.03206517056378548,
"mc1": 0.45165238678090575,
"mc1_stderr": 0.01742148030027764,
"mc2": 0.6056804036036146,
"mc2_stderr": 0.015579014786964863
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042196,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.6793467436765585,
"acc_stderr": 0.004657738398900938,
"acc_norm": 0.8653654650468035,
"acc_norm_stderr": 0.0034063520713417243
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.034961014811911786,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.034961014811911786
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236785,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236785
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.025715239811346758,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.025715239811346758
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562094,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562094
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887027,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887027
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025046,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.023094329582595698,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.023094329582595698
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7309417040358744,
"acc_stderr": 0.02976377940687497,
"acc_norm": 0.7309417040358744,
"acc_norm_stderr": 0.02976377940687497
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026622,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026622
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834832,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39329608938547483,
"acc_stderr": 0.01633726869427011,
"acc_norm": 0.39329608938547483,
"acc_norm_stderr": 0.01633726869427011
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.0239291555173513,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.0239291555173513
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262192,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49282920469361147,
"acc_stderr": 0.012768922739553308,
"acc_norm": 0.49282920469361147,
"acc_norm_stderr": 0.012768922739553308
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.025767252010855956,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.025767252010855956
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.704248366013072,
"acc_stderr": 0.01846315413263281,
"acc_norm": 0.704248366013072,
"acc_norm_stderr": 0.01846315413263281
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45165238678090575,
"mc1_stderr": 0.01742148030027764,
"mc2": 0.6056804036036146,
"mc2_stderr": 0.015579014786964863
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065597
},
"harness|gsm8k|5": {
"acc": 0.6557998483699773,
"acc_stderr": 0.013086800426693785
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
GeorgeGuo/detect | ---
license: apache-2.0
task_categories:
- text-classification
language:
- zh
tags:
- music
size_categories:
- 10K<n<100K
---
This is a dataset for testing |
TinyPixel/elm | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2542932
num_examples: 1073
download_size: 1390964
dataset_size: 2542932
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "elm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sick | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-nc-sa-3.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- extended|image-flickr-8k
- extended|semeval2012-sts-msr-video
task_categories:
- text-classification
task_ids:
- natural-language-inference
paperswithcode_id: sick
pretty_name: Sentences Involving Compositional Knowledge
dataset_info:
features:
- name: id
dtype: string
- name: sentence_A
dtype: string
- name: sentence_B
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: relatedness_score
dtype: float32
- name: entailment_AB
dtype: string
- name: entailment_BA
dtype: string
- name: sentence_A_original
dtype: string
- name: sentence_B_original
dtype: string
- name: sentence_A_dataset
dtype: string
- name: sentence_B_dataset
dtype: string
splits:
- name: train
num_bytes: 1180530
num_examples: 4439
- name: validation
num_bytes: 132913
num_examples: 495
- name: test
num_bytes: 1305846
num_examples: 4906
download_size: 217584
dataset_size: 2619289
---
# Dataset Card for sick
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://marcobaroni.org/composes/sick.html
- **Repository:** [Needs More Information]
- **Paper:** https://www.aclweb.org/anthology/L14-1314/
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
Shared and internationally recognized benchmarks are fundamental for the development of any computational system. We aim to help the research community working on compositional distributional semantic models (CDSMs) by providing SICK (Sentences Involving Compositional Knowledge), a large English benchmark tailored for them. SICK consists of about 10,000 English sentence pairs that include many examples of the lexical, syntactic and semantic phenomena that CDSMs are expected to account for, but do not require dealing with other aspects of existing sentential data sets (idiomatic multiword expressions, named entities, telegraphic language) that are not within the scope of CDSMs. By means of crowdsourcing techniques, each pair was annotated for two crucial semantic tasks: relatedness in meaning (with a 5-point rating scale as gold score) and entailment relation between the two elements (with three possible gold labels: entailment, contradiction, and neutral). The SICK data set was used in SemEval-2014 Task 1, and it is freely available for research purposes.
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
The dataset is in English.
## Dataset Structure
### Data Instances
Example instance:
```
{
"entailment_AB": "A_neutral_B",
"entailment_BA": "B_neutral_A",
"label": 1,
"id": "1",
"relatedness_score": 4.5,
"sentence_A": "A group of kids is playing in a yard and an old man is standing in the background",
"sentence_A_dataset": "FLICKR",
"sentence_A_original": "A group of children playing in a yard, a man in the background.",
"sentence_B": "A group of boys in a yard is playing and a man is standing in the background",
"sentence_B_dataset": "FLICKR",
"sentence_B_original": "A group of children playing in a yard, a man in the background."
}
```
### Data Fields
- pair_ID: sentence pair ID
- sentence_A: sentence A
- sentence_B: sentence B
- label: textual entailment gold label: entailment (0), neutral (1) or contradiction (2)
- relatedness_score: semantic relatedness gold score (on a 1-5 continuous scale)
- entailment_AB: entailment for the A-B order (A_neutral_B, A_entails_B, or A_contradicts_B)
- entailment_BA: entailment for the B-A order (B_neutral_A, B_entails_A, or B_contradicts_A)
- sentence_A_original: original sentence from which sentence A is derived
- sentence_B_original: original sentence from which sentence B is derived
- sentence_A_dataset: dataset from which the original sentence A was extracted (FLICKR vs. SEMEVAL)
- sentence_B_dataset: dataset from which the original sentence B was extracted (FLICKR vs. SEMEVAL)
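As a quick illustration of the `label` field, the snippet below mirrors the class mapping declared in the dataset metadata above. It is only a sketch for orientation, not part of any official loading code:

```python
# Mirrors the class_label mapping from the dataset metadata:
# 0 = entailment, 1 = neutral, 2 = contradiction.
LABEL_NAMES = ["entailment", "neutral", "contradiction"]

def decode_label(example):
    """Return the class name for an example's integer entailment label."""
    return LABEL_NAMES[example["label"]]

example = {
    "sentence_A": "A group of kids is playing in a yard and an old man is standing in the background",
    "sentence_B": "A group of boys in a yard is playing and a man is standing in the background",
    "label": 1,
    "relatedness_score": 4.5,
}
print(decode_label(example))  # neutral
```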
### Data Splits
| Train | Trial | Test |
|------:|------:|-----:|
| 4439  | 495   | 4906 |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
```
@inproceedings{marelli-etal-2014-sick,
title = "A {SICK} cure for the evaluation of compositional distributional semantic models",
author = "Marelli, Marco and
Menini, Stefano and
Baroni, Marco and
Bentivogli, Luisa and
Bernardi, Raffaella and
Zamparelli, Roberto",
booktitle = "Proceedings of the Ninth International Conference on Language Resources and Evaluation ({LREC}'14)",
month = may,
year = "2014",
address = "Reykjavik, Iceland",
publisher = "European Language Resources Association (ELRA)",
url = "http://www.lrec-conf.org/proceedings/lrec2014/pdf/363_Paper.pdf",
pages = "216--223",
}
```
### Contributions
Thanks to [@calpt](https://github.com/calpt) for adding this dataset. |
amanrangapur/Fin-Fact | ---
license: apache-2.0
task_categories:
- text-classification
- text-generation
language:
- en
tags:
- finance
pretty_name: FinFact
size_categories:
- 1K<n<10K
dataset_info:
- config_name: generation
features:
- name: url
dtype: string
- name: claim
dtype: string
- name: author
dtype: string
- name: posted
dtype: string
# - name: sci_digest
# sequence: string
# - name: justification
# sequence: string
# - name: issues
# dtype: string
# - name: image_data
# sequence:
# - name: image_src
# dtype: string
# - name: image_caption
# dtype: string
# - name: evidence
# sequence:
# - name: sentence
# dtype: string
# - name: hrefs
# dtype: string
# - name: label
# dtype: string
# - name: visualization_bias
# dtype: int32
---
<h1 align="center">Fin-Fact - Financial Fact-Checking Dataset</h1>
## Table of Contents
- [Overview](#overview)
- [Dataset Description](#dataset-description)
- [Dataset Usage](#dataset-usage)
- [Leaderboard](#leaderboard)
- [Dependencies](#dependencies)
- [Run models for paper metrics](#run-models-for-paper-metrics)
- [Citation](#citation)
- [Contribution](#contribution)
- [License](#license)
- [Contact](#contact)
## Overview
Welcome to the Fin-Fact repository! Fin-Fact is a comprehensive dataset designed specifically for financial fact-checking and explanation generation. This README provides an overview of the dataset, how to use it, and other relevant information. [Click here](https://arxiv.org/abs/2309.08793) to access the paper.
## Dataset Description
- **Name**: Fin-Fact
- **Purpose**: Fact-checking and explanation generation in the financial domain.
- **Labels**: The dataset includes various labels, including Claim, Author, Posted Date, Sci-digest, Justification, Evidence, Evidence href, Image href, Image Caption, Visualisation Bias Label, Issues, and Claim Label.
- **Size**: The dataset consists of 3121 claims spanning multiple financial sectors.
- **Additional Features**: The dataset goes beyond textual claims and incorporates visual elements, including images and their captions.
## Dataset Usage
Fin-Fact is a valuable resource for researchers, data scientists, and fact-checkers in the financial domain. Here's how you can use it:
1. **Download the Dataset**: You can download the Fin-Fact dataset [here](https://github.com/IIT-DM/Fin-Fact/blob/FinFact/finfact.json).
2. **Exploratory Data Analysis**: Perform exploratory data analysis to understand the dataset's structure, distribution, and any potential biases.
3. **Natural Language Processing (NLP) Tasks**: Utilize the dataset for various NLP tasks such as fact-checking, claim verification, and explanation generation.
4. **Fact Checking Experiments**: Train and evaluate machine learning models, including text and image analysis, using the dataset to enhance the accuracy of fact-checking systems.
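As a minimal sketch for step 3, the file can be explored with the standard library alone. This assumes `finfact.json` holds a list of objects carrying the `label` field described above — verify the actual schema after downloading:

```python
import json
from collections import Counter

def load_claims(path):
    """Load Fin-Fact entries from a JSON file assumed to hold a list of dicts."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)

def label_distribution(claims):
    """Count how many claims carry each veracity label."""
    return Counter(claim.get("label", "unknown") for claim in claims)

# With the real file: claims = load_claims("finfact.json")
claims = [
    {"claim": "Company X tripled revenue in 2020.", "label": "false"},
    {"claim": "The index fell 2% on Monday.", "label": "true"},
    {"claim": "Rates will rise next quarter.", "label": "neutral"},
]
print(label_distribution(claims))
```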
## Leaderboard
## Dependencies
We recommend you create an anaconda environment:
`conda create --name finfact python=3.6 conda-build`
Then, install Python requirements:
`pip install -r requirements.txt`
## Run models for paper metrics
We provide scripts that let you easily run our dataset on existing state-of-the-art models and re-create the metrics published in the paper. You should be able to reproduce our results from the paper by following these instructions. Please post an issue if you're unable to do this.
To run existing ANLI models for fact checking:
### Run:
1. BART
```bash
python anli.py --model_name 'ynie/bart-large-snli_mnli_fever_anli_R1_R2_R3-nli' --data_file finfact.json --threshold 0.5
```
2. RoBERTa
```bash
python anli.py --model_name 'ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli' --data_file finfact.json --threshold 0.5
```
3. ELECTRA
```bash
python anli.py --model_name 'ynie/electra-large-discriminator-snli_mnli_fever_anli_R1_R2_R3-nli' --data_file finfact.json --threshold 0.5
```
4. ALBERT
```bash
python anli.py --model_name 'ynie/albert-xxlarge-v2-snli_mnli_fever_anli_R1_R2_R3-nli' --data_file finfact.json --threshold 0.5
```
5. XLNet
```bash
python anli.py --model_name 'ynie/xlnet-large-cased-snli_mnli_fever_anli_R1_R2_R3-nli' --data_file finfact.json --threshold 0.5
```
6. GPT-2
```bash
python gpt2_nli.py --model_name 'fractalego/fact-checking' --data_file finfact.json
```
## Citation
```
@misc{rangapur2023finfact,
title={Fin-Fact: A Benchmark Dataset for Multimodal Financial Fact Checking and Explanation Generation},
author={Aman Rangapur and Haoran Wang and Kai Shu},
year={2023},
eprint={2309.08793},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
## Contribution
We welcome contributions from the community to help improve Fin-Fact. If you have suggestions, bug reports, or want to contribute code or data, please check our [CONTRIBUTING.md](CONTRIBUTING.md) file for guidelines.
## License
Fin-Fact is released under the [MIT License](/LICENSE). Please review the license before using the dataset.
## Contact
For questions, feedback, or inquiries related to Fin-Fact, please contact `arangapur@hawk.iit.edu`.
We hope you find Fin-Fact valuable for your research and fact-checking endeavors. Happy fact-checking!
|
Seanxh/twitter_dataset_1713225737 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 243707
num_examples: 562
download_size: 78336
dataset_size: 243707
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
qgallouedec/prj_gia_dataset_metaworld_lever_pull_v2_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the lever-pull-v2 environment, sampled from the lever-pull-v2 policy.
This dataset was created as part of the Generally Intelligent Agents (GIA) project: https://github.com/huggingface/gia
## Load dataset
First, clone it with
```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_lever_pull_v2_1111
```
Then, load it with
```python
import numpy as np
dataset = np.load("prj_gia_dataset_metaworld_lever_pull_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys()) # dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
|
liuyanchen1015/VALUE_qnli_lexical | ---
dataset_info:
features:
- name: question
dtype: string
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1289179
num_examples: 4844
- name: test
num_bytes: 1290872
num_examples: 4829
- name: train
num_bytes: 24041472
num_examples: 92638
download_size: 17902736
dataset_size: 26621523
---
# Dataset Card for "VALUE_qnli_lexical"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
traintogpb/aihub-koen-translation-integrated-base-10m | ---
task_categories:
- translation
language:
- en
- ko
size_categories:
- 10M<n<100M
---
# AI Hub Ko-En Translation Dataset (Integrated)
This dataset merges eight Korean-English translation datasets from AI Hub.
The merged data contains 10,416,509 examples in total, split into train / validation / test sets at an 8:1:1 ratio.
- base-10m: 100% of the merged data, 10,416,509 examples in total
- mini-1m: 10% of the merged data (10% randomly sampled within each split of base-10m), 1,041,651 examples in total
- tiny-100k: 1% of the merged data (1% randomly sampled within each split of base-10m), 104,165 examples in total
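The mini and tiny subsets can be reproduced with per-split random sampling along these lines. This is a sketch only — the exact sampling routine and seed used for the published subsets are not documented here:

```python
import random

def subsample_split(examples, fraction, seed=42):
    """Randomly pick `fraction` of the examples within a single split."""
    rng = random.Random(seed)
    k = round(len(examples) * fraction)
    return rng.sample(examples, k)

train = [f"pair-{i}" for i in range(1000)]
mini_train = subsample_split(train, 0.10)  # ~10% of the split
tiny_train = subsample_split(train, 0.01)  # ~1% of the split
print(len(mini_train), len(tiny_train))  # 100 10
```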
## Subsets
The source datasets are listed below; the number after each dataset name is its datasetkey in aihubshell.
- [전문분야 한영 말뭉치](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=111) (111)
  - Total examples: 1,350,000
  - After deduplication: 1,350,000
  - Columns used: '한국어', '영어'
- [한국어-영어 번역 말뭉치(기술과학)](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=124) (124)
  - Total examples: 1,344,631
  - After deduplication: 1,344,631
  - Columns used: 'ko', 'en'
- [한국어-영어 번역 말뭉치(사회과학)](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=125) (125)
  - Total examples: 1,361,845
  - After deduplication: 1,361,825
  - Columns used: 'ko', 'en'
- [한국어-영어 번역(병렬) 말뭉치](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=126) (126)
  - Total examples: 1,602,418
  - After deduplication: 1,599,924
  - Columns used: '원문', '번역문'
- [산업정보 연계 주요국 특허 영-한 데이터](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=563) (563)
  - Total examples: 359,999
  - After deduplication: 358,424
  - Columns used: 'astrt_cont_kor', 'astrt_cont_eng'
- [일상생활 및 구어체 한-영 번역 병렬 말뭉치 데이터](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=71265) (71265)
  - Total examples: 2,700,345
  - After deduplication: 2,486,058
  - Columns used: 'ko', 'en'
- [기술과학 분야 한-영 번역 병렬 말뭉치 데이터](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=71266) (71266)
  - Total examples: 1,350,162
  - After deduplication: 1,328,987
  - Columns used: 'ko', 'en'
- [방송콘텐츠 한국어-영어 번역 말뭉치](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=71382) (71382)
  - Total examples: 587,084
  - After deduplication: 586,660
  - Columns used: '원문', '최종번역문'
|
Viniciaao/HardLevel | ---
license: openrail
---
|
TrainingDataPro/license_plates | ---
license: cc-by-nc-nd-4.0
task_categories:
- image-to-text
language:
- en
tags:
- finance
dataset_info:
- config_name: Brazil_youtube
features:
- name: image
dtype: image
- name: labeled_image
dtype: image
- name: bbox
dtype: string
- name: license_plate.id
dtype: string
- name: license_plate.visibility
dtype: string
- name: license_plate.rows_count
dtype: uint8
- name: license_plate.number
dtype: string
- name: license_plate.serial
dtype: string
- name: license_plate.country
dtype: string
- name: license_plate.mask
dtype: string
splits:
- name: train
num_bytes: 173536648
num_examples: 72
download_size: 22606962
dataset_size: 173536648
- config_name: Estonia_platesmania
features:
- name: image
dtype: image
- name: labeled_image
dtype: image
- name: bbox
dtype: string
- name: license_plate.id
dtype: string
- name: license_plate.visibility
dtype: string
- name: license_plate.rows_count
dtype: uint8
- name: license_plate.number
dtype: string
- name: license_plate.serial
dtype: string
- name: license_plate.country
dtype: string
- name: license_plate.mask
dtype: string
splits:
- name: train
num_bytes: 7990452
num_examples: 10
download_size: 7863164
dataset_size: 7990452
- config_name: Finland_platesmania
features:
- name: image
dtype: image
- name: labeled_image
dtype: image
- name: bbox
dtype: string
- name: license_plate.id
dtype: string
- name: license_plate.visibility
dtype: string
- name: license_plate.rows_count
dtype: uint8
- name: license_plate.number
dtype: string
- name: license_plate.serial
dtype: string
- name: license_plate.country
dtype: string
- name: license_plate.mask
dtype: string
splits:
- name: train
num_bytes: 9650579
num_examples: 10
download_size: 9485725
dataset_size: 9650579
- config_name: Kazakhstan_platesmania
features:
- name: image
dtype: image
- name: labeled_image
dtype: image
- name: bbox
dtype: string
- name: license_plate.id
dtype: string
- name: license_plate.visibility
dtype: string
- name: license_plate.rows_count
dtype: uint8
- name: license_plate.number
dtype: string
- name: license_plate.serial
dtype: string
- name: license_plate.country
dtype: string
- name: license_plate.mask
dtype: string
splits:
- name: train
num_bytes: 14064541
num_examples: 19
download_size: 7265915
dataset_size: 14064541
- config_name: Kazakhstan_youtube
features:
- name: image
dtype: image
- name: labeled_image
dtype: image
- name: bbox
dtype: string
- name: license_plate.id
dtype: string
- name: license_plate.visibility
dtype: string
- name: license_plate.rows_count
dtype: uint8
- name: license_plate.number
dtype: string
- name: license_plate.serial
dtype: string
- name: license_plate.country
dtype: string
- name: license_plate.mask
dtype: string
splits:
- name: train
num_bytes: 6324396
num_examples: 22
download_size: 2852873
dataset_size: 6324396
- config_name: Lithuania_platesmania
features:
- name: image
dtype: image
- name: labeled_image
dtype: image
- name: bbox
dtype: string
- name: license_plate.id
dtype: string
- name: license_plate.visibility
dtype: string
- name: license_plate.rows_count
dtype: uint8
- name: license_plate.number
dtype: string
- name: license_plate.serial
dtype: string
- name: license_plate.country
dtype: string
- name: license_plate.mask
dtype: string
splits:
- name: train
num_bytes: 8127614
num_examples: 10
download_size: 7940839
dataset_size: 8127614
- config_name: Serbia_platesmania
features:
- name: image
dtype: image
- name: labeled_image
dtype: image
- name: bbox
dtype: string
- name: license_plate.id
dtype: string
- name: license_plate.visibility
dtype: string
- name: license_plate.rows_count
dtype: uint8
- name: license_plate.number
dtype: string
- name: license_plate.serial
dtype: string
- name: license_plate.country
dtype: string
- name: license_plate.mask
dtype: string
splits:
- name: train
num_bytes: 10000777
num_examples: 10
download_size: 9808356
dataset_size: 10000777
- config_name: Serbia_youtube
features:
- name: image
dtype: image
- name: labeled_image
dtype: image
- name: bbox
dtype: string
- name: license_plate.id
dtype: string
- name: license_plate.visibility
dtype: string
- name: license_plate.rows_count
dtype: uint8
- name: license_plate.number
dtype: string
- name: license_plate.serial
dtype: string
- name: license_plate.country
dtype: string
- name: license_plate.mask
dtype: string
splits:
- name: train
num_bytes: 26535839
num_examples: 67
download_size: 4044272
dataset_size: 26535839
- config_name: UAE_platesmania
features:
- name: image
dtype: image
- name: labeled_image
dtype: image
- name: bbox
dtype: string
- name: license_plate.id
dtype: string
- name: license_plate.visibility
dtype: string
- name: license_plate.rows_count
dtype: uint8
- name: license_plate.number
dtype: string
- name: license_plate.serial
dtype: string
- name: license_plate.country
dtype: string
- name: license_plate.mask
dtype: string
splits:
- name: train
num_bytes: 8236358
num_examples: 10
download_size: 8028800
dataset_size: 8236358
- config_name: UAE_youtube
features:
- name: image
dtype: image
- name: labeled_image
dtype: image
- name: bbox
dtype: string
- name: license_plate.id
dtype: string
- name: license_plate.visibility
dtype: string
- name: license_plate.rows_count
dtype: uint8
- name: license_plate.number
dtype: string
- name: license_plate.serial
dtype: string
- name: license_plate.country
dtype: string
- name: license_plate.mask
dtype: string
splits:
- name: train
num_bytes: 41202317
num_examples: 162
download_size: 2666314
dataset_size: 41202317
---
# License Plates
Over **1.2 million** annotated license plates from vehicles around the world. This dataset is tailored for **License Plate Recognition tasks** and includes images from both YouTube and PlatesMania.
Annotation details are provided in the About section below.
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/car-license-plates?utm_source=huggingface&utm_medium=cpc&utm_campaign=license_plates) to discuss your requirements, learn about the price and buy the dataset.
# About
## Variables in .csv files:
- **file_name** - filename of the original car photo
- **license_plate.country** - country where the vehicle was captured
- **bbox** - normalized bounding box of the car
- **license_plate.visibility** - the visibility type of the license plate
- **license_plate.id** - unique license plate's id
- **license_plate.mask** - normalized coordinates of the license plate
- **license_plate.rows_count** - whether the license plate is single-line or double-line
- **license_plate.number** - recognized text of the license plate
- **license_plate.serial** - only for UAE numbers - license plate series
- **license_plate.region** - only for UAE numbers - license plate subregion
- **license_plate.color** - only for Saudi Arabia - color of the international plate code
**How it works**: *go to the folder of the country; its CSV file contains all labeling information for the images located in that folder's "photos" subfolder.*
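To make that layout concrete, here is a minimal sketch of reading one such per-country CSV with Python's standard library; the two sample rows (file names, bbox values, plate numbers) are made up for illustration, and the real files may carry additional columns:

```python
import csv
import io

# Tiny inline sample standing in for one country's CSV file; the column
# names follow the field list above, but the values here are made up.
SAMPLE_CSV = """file_name,license_plate.country,bbox,license_plate.number
car_001.jpg,Serbia,"[0.41, 0.62, 0.12, 0.05]",BG123AB
car_002.jpg,Serbia,"[0.35, 0.58, 0.10, 0.04]",NS456CD
"""

def parse_bbox(raw: str) -> list[float]:
    """Turn a stringified '[x, y, w, h]' bbox into a list of floats."""
    return [float(part) for part in raw.strip("[] ").split(",")]

rows = []
for record in csv.DictReader(io.StringIO(SAMPLE_CSV)):
    record["bbox"] = parse_bbox(record["bbox"])
    rows.append(record)

for record in rows:
    print(record["file_name"], record["license_plate.number"], record["bbox"])
```

For the real dataset, replace the inline string with `open("Serbia_platesmania/labels.csv")` (or whichever country folder you are using) and join `file_name` against the "photos" subfolder.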
## [**TrainingData**](https://trainingdata.pro/data-market/car-license-plates?utm_source=huggingface&utm_medium=cpc&utm_campaign=license_plates) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
AnonymousSub/recipe_RL_data_roberta-base | ---
annotations_creators: []
language:
- en
language_creators: []
license: []
multilinguality:
- monolingual
pretty_name: recipe RL roberta base
size_categories: []
source_datasets: []
tags: []
task_categories: []
task_ids: []
---
# Dataset Description
## Structure
- Consists of 5 fields
- Each row corresponds to a policy: a sequence of actions given an initial `<START>` state, with corresponding rewards at each step.
## Fields
`steps`, `step_attn_masks`, `rewards`, `actions`, `dones`
## Field descriptions
- `steps` (List of lists of `Int`s) - tokenized step tokens of all the steps in the policy sequence (here we use the `roberta-base` tokenizer, as `roberta-base` would be used to encode each step of a recipe)
- `step_attn_masks` (List of lists of `Int`s) - Attention masks corresponding to `steps`
- `rewards` (List of `Float`s) - Sequence of rewards (normalized between 0 and 1) assigned per step.
- `actions` (List of lists of `Int`s) - Sequence of actions (one-hot encoded, as the action space is discrete). There are `33` possible actions: since we consider a maximum of `16` steps per recipe, the action value ranges from `-16` to `+16`, and the class label is obtained by adding `16` to the actual action value.
- `dones` (List of `Bool`s) - Sequence of flags indicating whether the episode is complete once that step is reached.
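The action encoding above can be sketched as follows (the helper names are ours and purely illustrative; only the `-16..+16` range and the `+16` label offset come from the field descriptions):

```python
MAX_STEPS = 16            # maximum number of steps per recipe
NUM_ACTIONS = 33          # actions range over -16 .. +16 inclusive

def action_to_label(action: int) -> int:
    """Map a raw action in [-16, 16] to its class label in [0, 32]."""
    assert -MAX_STEPS <= action <= MAX_STEPS
    return action + MAX_STEPS

def label_to_action(label: int) -> int:
    """Invert the mapping: class label in [0, 32] back to the raw action."""
    assert 0 <= label < NUM_ACTIONS
    return label - MAX_STEPS

def one_hot(label: int) -> list[int]:
    """One-hot encode a class label, matching the layout of `actions`."""
    vec = [0] * NUM_ACTIONS
    vec[label] = 1
    return vec

print(action_to_label(-16))  # -> 0
print(label_to_action(32))   # -> 16
```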
## Dataset Size
- Number of rows = `2255673`
- Maximum number of steps per row = `16` |
chansung/test | ---
license: apache-2.0
---
|
Falah/village4kids_2_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2094
num_examples: 8
download_size: 2965
dataset_size: 2094
---
# Dataset Card for "village4kids_2_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aisc-team-a1/guidelines | ---
license: other
license_name: common-crawl
license_link: LICENSE
task_categories:
- text-generation
language:
- en
pretty_name: Clinical Guidelines
size_categories:
- 10K<n<100K
tags:
- medical
- health
dataset_info:
features:
- name: id
dtype: string
- name: source
dtype: string
- name: title
dtype: string
- name: clean_text
dtype: string
- name: raw_text
dtype: string
- name: url
dtype: string
- name: overview
dtype: string
splits:
- name: train
num_bytes: 865223621
num_examples: 37970
download_size: 424262411
dataset_size: 865223621
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
*This is a dataset repository made for the AISC class at Harvard Medical School. Please find the original dataset repository here: https://huggingface.co/datasets/epfl-llm/guidelines*
### 🎉 **NEW DROP** 🎉 PubMed Guidelines
We just added 1627 clinical guidelines found in PubMed and PubMed Central to the dataset on December 23rd, 2023. Merry Christmas!
# Clinical Guidelines
The Clinical Guidelines corpus is a new dataset of 47K clinical practice guidelines from 17 high-quality online medical sources. This dataset serves as a crucial component of the original training corpus of the [Meditron](https://huggingface.co/epfl-llm/meditron-70b) Large Language Model (LLM). We publicly release a subset of 37K articles from our Guidelines corpus, extracted from 9 of 17 sources that allow content redistribution, namely CCO, CDC, CMA, ICRC, NICE, PubMed, SPOR, WHO and WikiDoc.
You can scrape and clean all 17 guideline sources using our code in [epfLLM/meditron](https://github.com/epfLLM/meditron).
<img width=75% src="sources.png" alt="Sources of Clinical Practice Guidelines" title="CPG sources">
## Dataset Details
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [EPFL LLM Team](https://huggingface.co/epfl-llm)
- **Language(s):** English only
- **License:** [Common Crawl Foundation Terms of Use](https://commoncrawl.org/terms-of-use)
- **Repository:** [epfLLM/meditron](https://github.com/epfLLM/meditron)
- **Paper:** *[MediTron-70B: Scaling Medical Pretraining for Large Language Models](https://arxiv.org/abs/2311.16079)*
- **Knowledge Cutoff**: August 2023
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
The dataset was curated to provide a high-quality collection of clinical practice guidelines (CPGs) for the medical training of LLMs. Our Clinical Guidelines corpus comprises 48,096 articles from 17 globally recognized sources for clinician and patient-directed guidance across high and low-resource settings, multiple medical domains (internal medicine, pediatrics, oncology, infectious disease, etc.) and multiple geographical locations.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
Clinical practice guidelines are rigorously researched frameworks designed to guide healthcare practitioners and patients in making evidence-based decisions regarding diagnosis, treatment, and management.
They are compiled through a systematic process of collaborative consensus between experts to establish recommendations from the latest evidence on best practices that would maximize benefit in light of practical concerns such as available resources and context. As a super-synthesis of meta-analyses, they sit atop the *evidence pyramid* and form the basis of actionable evidence-based practice.
Clinical guidelines differ based on several factors:
- **Organizational level**: CPGs are produced at various organizational granularities, ranging from global to hospital-level initiatives directed by international professional medical associations to informal consortia, regional or national governmental bodies to individual NGOs and hospitals.
- **Geographic scope**: The geographic scope ranges from global (WHO) to national (CDC, NICE) and regional (Ontario, Melbourne) to institutional (ICRC, Mayo Clinic). This corpus is biased towards English-speaking regions due to its exclusive focus on English content.
- **Resource level**: The corpus also represents health care concerns from high- (Ontario, Melbourne), low- (WHO), and volatile- (ICRC) resource settings.
- **Audience level**: Guidelines also contains a range of technical and conversational vocabulary with target audiences of clinicians or patients (or both), and is sometimes highly specialized within a theme (cancer, pediatrics, infectious disease).
- **Peer-review**: The peer review processes also ranged from UN bodies (WHO), institutional review boards (ICRC), professional associations (AAFP) to publicly crowdsourced knowledge bases (WikiDoc).
- **Document size**: Article length varies widely from very short statements to 100+ page guides.
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
The dataset is sourced from 17 globally recognized medical entities, covering a wide range of healthcare contexts and audiences.
We employed pragmatic selection criteria over medical sources, seeking CPGs that were:
- (1) open-access
- (2) systematically formatted with homogenous textual structure (i.e., in a format in which automated processes could be deployed without excessive risk of misaligning textual sequences)
- (3) in the language predominantly represented by the pre-training corpus of Llama (i.e., English)
- (4) covering a breadth of medical sub-domains, audiences (clinician, nurse, patient), and resource settings (high, low, and humanitarian response settings)
| Source | Full Name | Tag | Guidelines | Words | Audience | Country | Released |
|-|-|-|-|-|-|-|-|
| **[AAFP](https://www.aafp.org)** | American Academy of Family Physicians | `aafp` | 50 | 9.4K | Doctor | USA | No |
| **[CCO](https://www.cancercareontario.ca/en/guidelines-advice)** | Cancer Care Ontario | `cco` | 87 | 199K | Doctor | Canada | **Yes** |
| **[CDC](https://www.cdc.gov/)** | Center for Disease Control and Prevention | `cdc` | 621 | 6.7M | Doctor | USA | **Yes** |
| **[CMA](https://joulecma.ca/)** | Canadian Medical Association | `cma` | 431 | 1.7M | Doctor | Canada | **Yes** |
| **[CPS](https://cps.ca)** | Canadian Paediatric Society | `cps` | 54 | 133K | Doctor | Canada | No |
| **[drugs.com](https://www.drugs.com/)** | Drugs.com | `drugs` | 6548 | 4.1M | Both | International | No |
| **[GuidelineCentral](https://www.guidelinecentral.com/)** | GuidelineCentral | `gc` | 1029 | 1M | Doctor | Mix | No |
| **[ICRC](http://icrc.org/)** | International Committee of the Red Cross | `icrc` | 49 | 1.2M | Doctor | International | **Yes** |
| **[IDSA](https://www.idsociety.org/)** | Infectious Diseases Society of America | `idsa` | 47 | 646K | Doctor | USA | No |
| **[MAGIC](https://magicevidence.org/)** | Making GRADE The Irresistible Choice | `magic` | 52 | 415K | Doctor | Mix | No |
| **[MayoClinic](https://www.mayoclinic.org/)** | MayoClinic | `mayo` | 1100 | 2.2M | Patient | USA | No |
| **[NICE](https://www.nice.org.uk/guidance)** | National Institute for Health and Care Excellence | `nice` | 1656 | 8.1M | Doctor | UK | **Yes** |
| **[PubMed](https://pubmed.ncbi.nlm.nih.gov)** | PubMed | `pubmed` | 1627 | 10.8M | Doctor | Mix | **Yes** |
| **[RCH](https://www.rch.org.au/clinicalguide/about_rch_cpgs/welcome_to_the_clinical_practice_guidelines/)** | Royal Children's Hospital Melbourne | `rch` | 384 | 410K | Doctor | Australia | No |
| **[SPOR](https://sporevidencealliance.ca/key-activities/cpg-asset-map/cpg-database/)** | Strategy for Patient-Oriented Research | `spor` | 217 | 1.1M | Doctor | Canada | **Yes** |
| **[WHO](https://www.who.int/publications/who-guidelines)** | World Health Organization | `who` | 223 | 3.1M | Both | International | **Yes** |
| **[WikiDoc](https://www.wikidoc.org/)** | WikiDoc | `wikidoc` | 33058 | 34M | Both | International | **Yes** |
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
PDF documents were converted to text using [GROBID](https://github.com/kermitt2/grobid).
After extracting the raw text from each source, we cleaned data with an ad-hoc process to exclude irrelevant or repetitive content that did not contribute to the textual content, such as URLs, references, figures, table delimiters, and ill-formatted characters.
This filtering procedure was performed differently for each source using a sample of 50 articles. Please note that this procedure is not perfect, as it may have removed useful information or kept superfluous content. We provide the `raw_text` for each article if you would like to perform your own cleaning step.
Additionally, the text was standardized to a unified format with hierarchical section headers indicated by `'#'`, homogenous spacing `'\n\n'` separating paragraphs, and normalized lists formatted with `'- '` bullet points.
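As a rough illustration of that kind of normalization (the regexes below are our own sketch, not the project's actual cleaning code):

```python
import re

def normalize(text: str) -> str:
    """Illustrative cleanup along the lines described above: homogenize
    bullet markers to '- ' and collapse blank runs to one '\n\n' break."""
    # Normalize common bullet markers ('*', '•') at line start to '- '.
    text = re.sub(r"^[*•]\s*", "- ", text, flags=re.MULTILINE)
    # Collapse three or more consecutive newlines into one blank line.
    text = re.sub(r"\n{3,}", "\n\n", text)
    return text.strip()

raw = "# Dosage\n\n\n\n* first-line drug\n• second-line drug\n"
print(normalize(raw))
```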
Finally, all samples were deduplicated using title matching, and articles that were too short or not English were filtered out.
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
As the articles are publicly accessible, no personal or sensitive information is included.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
Each row of the dataset represents one clinical practice guideline article, and consists of the following dataset fields (all strings):
| Field | Description | Sources with field |
|-------------|-------------------------------------------|------------------------------|
| `id` | Unique identifier for each article | All |
| `source` | Source tag (`cco`, `cdc`, `cma`, `icrc`, `nice`, `spor`, `who` or `wikidoc`)| All |
| `title` | Title of the article | CMA, NICE & WikiDoc |
| `url` | URL of the article | NICE, WikiDoc & PubMed |
| `raw_text` | Unprocessed scraped article text | All |
| `clean_text`| Cleaned and formatted article text | All |
| `overview` | Short summary or abstract of the article | NICE & PubMed |
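Filtering by source tag is then a one-liner. The sketch below runs on in-memory stand-in rows so that it needs no download; with the real data, the same predicate works via `dataset.filter` after `load_dataset("epfl-llm/guidelines", split="train")`:

```python
# Stand-in rows mirroring the schema above (titles are illustrative);
# real rows come from load_dataset("epfl-llm/guidelines", split="train").
rows = [
    {"id": "1", "source": "who",     "title": "Malaria guidance"},
    {"id": "2", "source": "wikidoc", "title": "Asthma overview"},
    {"id": "3", "source": "who",     "title": "TB treatment"},
]

# Keep only WHO guidelines; with the `datasets` API this would be
# dataset.filter(lambda ex: ex["source"] == "who").
who_rows = [r for r in rows if r["source"] == "who"]

print([r["title"] for r in who_rows])
```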
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
The dataset is intended for use in tasks related to text generation, specifically in the context of clinical practice guidelines. It can be employed for training language models and other natural language processing applications within the healthcare domain.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
- **Redistribution**: Please always check redistribution licenses before using the content as these may also evolve over time. To the best of our knowledge, we are following the redistribution licensing of each source and we invite users to inform us if that is not the case.
- **Malicious use**: We do not support any use of this corpus that may be harmful. Creating tools that provide clinical advice is commendable, but extremely dangerous if not done with the appropriate care. Such tools need to be validated for safety and utility by medical professionals in randomized controlled trials. i.e. please do not create cowboy health apps that fool vulnerable users into thinking they are receiving validated advice.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
- **Peer-Review Quality**: It is important to understand that while most sources are validated by internationally endorsed professional associations, a large proportion of articles are from WikiDoc, which contains crowdsourced content. While edits in WikiDoc are generally restricted to expert review, the process of consensus and oversight differs from the traditional rigor of clinical guidelines.
- **Representation**: This corpus is in English, and over-represents English-speaking regions. While we have included WHO and ICRC guidelines for low-resource settings, further work needs to be done to scrape sources from diverse contexts.
- **Temporal scope**: Guidelines are constantly updated and these represent a snapshot of each in August 2023. Please re-scrape for updated content.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
We warmly invite users to help us build a more representative corpus with high-quality peer-reviewed clinical practice guidelines in various languages and representing the full scope of clinical specialties and geographic regions.
We encourage users of this content to be mindful of its current limitations in temporal and geographic scope and we repeat our warning: creating tools that provide clinical advice is commendable, but extremely dangerous if not done with the appropriate care. Such tools need to be validated for safety and utility by medical professionals in randomized controlled trials. i.e. Please don’t create cowboy health apps that fool vulnerable users into thinking they are receiving validated advice.
## Acknowledgments
The availability of open-access clinical practice guidelines (CPG) was critical to this work, and we thank all the societies listed above. A broader representation of geography, medical specialties, and contexts (especially low-resource settings) could be achieved through more standardized CPG formatting practices to ensure reliable textual extraction (e.g., releasing `.txt` or `.html` versions with structured content). We encourage the CPG community to continue to make these documents available (open-access with permissive licenses for incorporation into large language models) and easily usable.
## Authors
- **Curation**: Mary-Anne Hartley
- **Scraping**: Antoine Bonnet, Alexandre Sallinen, Igor Krawczuk, Kyle Matoba
- **Cleaning**: Antoine Bonnet, Alexandre Sallinen
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
If you use the Clinical Guidelines corpus, please cite our work:
```
@misc{chen2023meditron70b,
title={MEDITRON-70B: Scaling Medical Pretraining for Large Language Models},
author={Zeming Chen and Alejandro Hernández-Cano and Angelika Romanou and Antoine Bonnet and Kyle Matoba and Francesco Salvi and Matteo Pagliardini and Simin Fan and Andreas Köpf and Amirkeivan Mohtashami and Alexandre Sallinen and Alireza Sakhaeirad and Vinitra Swamy and Igor Krawczuk and Deniz Bayazit and Axel Marmet and Syrielle Montariol and Mary-Anne Hartley and Martin Jaggi and Antoine Bosselut},
year={2023},
eprint={2311.16079},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@software{epfmedtrn,
author = {Zeming Chen and Alejandro Hernández-Cano and Angelika Romanou and Antoine Bonnet and Kyle Matoba and Francesco Salvi and Matteo Pagliardini and Simin Fan and Andreas Köpf and Amirkeivan Mohtashami and Alexandre Sallinen and Alireza Sakhaeirad and Vinitra Swamy and Igor Krawczuk and Deniz Bayazit and Axel Marmet and Syrielle Montariol and Mary-Anne Hartley and Martin Jaggi and Antoine Bosselut},
title = {MediTron-70B: Scaling Medical Pretraining for Large Language Models},
  month = nov,
year = 2023,
url = {https://github.com/epfLLM/meditron}
}
```
|
llm-aes/meva_full_rate_explain | ---
dataset_info:
features:
- name: task_id
dtype: string
- name: worker_id
dtype: string
- name: human_label
dtype: int64
- name: llm_label
dtype: int64
- name: generator_1
dtype: string
- name: generator_2
dtype: string
- name: premise
dtype: string
splits:
- name: train
num_bytes: 391250
num_examples: 2000
download_size: 49350
dataset_size: 391250
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/rumi_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of rumi/朱城ルミ/瑠美 (Blue Archive)
This is the dataset of rumi/朱城ルミ/瑠美 (Blue Archive), containing 207 images and their tags.
The core tags of this character are `brown_hair, animal_ears, long_hair, breasts, fox_ears, halo, hair_between_eyes, large_breasts, purple_eyes, crossed_bangs, red_halo, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 207 | 356.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rumi_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 207 | 300.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rumi_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 520 | 598.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rumi_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/rumi_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, black_skirt, blush, brown_pantyhose, looking_at_viewer, smile, solo, shirt, side_slit, simple_background, white_background, hand_on_own_hip, open_mouth, thigh_strap, black_pantyhose, gourd, rope_belt, short_sleeves, thighs |
| 1 | 9 |  |  |  |  |  | 1girl, black_skirt, pantyhose, smile, solo, blush, looking_at_viewer, shirt, holding, sweat, steam |
| 2 | 9 |  |  |  |  |  | 1girl, blush, solo, covered_nipples, looking_at_viewer, simple_background, smile, sweat, white_background, red_ascot, black_skirt, wet_shirt, see-through_shirt, upper_body |
| 3 | 8 |  |  |  |  |  | 1girl, blush, looking_at_viewer, sweat, bra_visible_through_clothes, see-through_shirt, solo, wet_shirt, simple_background, black_bra, black_skirt, hand_on_own_hip, open_mouth, smile, steaming_body, high-waist_skirt, pantyhose, short_sleeves |
| 4 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, simple_background, white_background, one_eye_closed, chinese_clothes, black_pantyhose, sweat, thigh_strap, grin, sitting, upper_body |
| 5 | 16 |  |  |  |  |  | 1boy, 1girl, blush, solo_focus, smile, looking_at_viewer, penis, nipples, pov, breasts_squeezed_together, cum_on_breasts, sweat, censored, huge_breasts, open_mouth, paizuri_under_clothes, shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_skirt | blush | brown_pantyhose | looking_at_viewer | smile | solo | shirt | side_slit | simple_background | white_background | hand_on_own_hip | open_mouth | thigh_strap | black_pantyhose | gourd | rope_belt | short_sleeves | thighs | pantyhose | holding | sweat | steam | covered_nipples | red_ascot | wet_shirt | see-through_shirt | upper_body | bra_visible_through_clothes | black_bra | steaming_body | high-waist_skirt | one_eye_closed | chinese_clothes | grin | sitting | 1boy | solo_focus | penis | nipples | pov | breasts_squeezed_together | cum_on_breasts | censored | huge_breasts | paizuri_under_clothes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------|:------------------|:--------------------|:--------|:-------|:--------|:------------|:--------------------|:-------------------|:------------------|:-------------|:--------------|:------------------|:--------|:------------|:----------------|:---------|:------------|:----------|:--------|:--------|:------------------|:------------|:------------|:--------------------|:-------------|:------------------------------|:------------|:----------------|:-------------------|:-----------------|:------------------|:-------|:----------|:-------|:-------------|:--------|:----------|:------|:----------------------------|:-----------------|:-----------|:---------------|:------------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | | X | X | X | X | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | | X | X | X | | | X | X | | | | | | | | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | | X | X | X | | | X | | X | X | | | | | X | | X | | X | | | | X | X | | X | X | X | X | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | X | | X | | X | | | X | X | | | X | X | | | | | | | X | | | | | | X | | | | | X | X | X | X | | | | | | | | | | |
| 5 | 16 |  |  |  |  |  | X | | X | | X | X | | X | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
djfelipe/vocall | ---
license: openrail
---
|
1aurent/RxRx1 | ---
license: cc-by-4.0
size_categories:
- 100K<n<1M
task_categories:
- image-classification
tags:
- biology
- drug
- cells
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype:
array3_d:
dtype: uint8
shape:
- 512
- 512
- 6
- name: site_id
dtype: string
- name: well_id
dtype: string
- name: cell_type
dtype: string
- name: experiment
dtype: string
- name: plate
dtype: int32
- name: well
dtype: string
- name: site
dtype: int32
- name: well_type
dtype:
class_label:
names:
'0': treatment
'1': positive_control
'2': negative_control
- name: sirna
dtype: string
- name: sirna_id
dtype: int32
- name: embeddings
sequence: float32
length: 128
splits:
- name: train
num_bytes: 213139738276
num_examples: 81224
- name: test
num_bytes: 116210798412
num_examples: 44286
dataset_size: 329350536688
---
[](https://doi.org/10.48550/arXiv.2301.05768)
# RxRx1: A Dataset for Evaluating Experimental Batch Correction Methods

**Homepage**: https://www.rxrx.ai/rxrx1 \
**Publication Date**: 2019-06 \
**License**: [Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)](https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode) \
**Citation**:
```bibtex
@misc{sypetkowski2023rxrx1,
title = {RxRx1: A Dataset for Evaluating Experimental Batch Correction Methods},
author = {Maciej Sypetkowski and Morteza Rezanejad and Saber Saberian and Oren Kraus and John Urbanik and James Taylor and Ben Mabey and Mason Victors and Jason Yosinski and Alborz Rezazadeh Sereshkeh and Imran Haque and Berton Earnshaw},
year = {2023},
eprint = {2301.05768},
archiveprefix = {arXiv},
primaryclass = {cs.CV}
}
```
## Description
High-throughput screening techniques are commonly used to obtain large quantities of data in many fields of biology. It is well known that artifacts arising from variability in the technical execution of different experimental batches within such screens confound these observations and can lead to invalid biological conclusions. It is therefore necessary to account for these batch effects when analyzing outcomes. In this paper we describe RxRx1, a biological dataset designed specifically for the systematic study of batch effect correction methods. The dataset consists of 125,510 high-resolution fluorescence microscopy images of human cells under 1,138 genetic perturbations in 51 experimental batches across 4 cell types. Visual inspection of the images alone clearly demonstrates significant batch effects. We propose a classification task designed to evaluate the effectiveness of experimental batch correction methods on these images and examine the performance of a number of correction methods on this task. Our goal in releasing RxRx1 is to encourage the development of effective experimental batch correction methods that generalize well to unseen experimental batches.
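Each `image` is a 512×512×6 `uint8` array, one plane per fluorescent stain. Below is a minimal NumPy sketch of working with that shape; the random array is a stand-in, and a real sample would come from loading the dataset itself:

```python
import numpy as np

# Dummy 6-channel image with the dataset's shape (512, 512, 6); a real
# sample would come from e.g. load_dataset("1aurent/RxRx1", split="train").
image = np.random.default_rng(0).integers(0, 256, size=(512, 512, 6), dtype=np.uint8)

# Split into six single-channel planes, one per fluorescence channel.
channels = [image[:, :, c] for c in range(image.shape[-1])]

# A simple visualization trick: max-project the six stains into one plane.
projection = image.max(axis=-1)

print(len(channels), channels[0].shape, projection.shape)
```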
|
open-llm-leaderboard/details_FelixChao__Gemma-10.2B-Coder | ---
pretty_name: Evaluation run of FelixChao/Gemma-10.2B-Coder
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/Gemma-10.2B-Coder](https://huggingface.co/FelixChao/Gemma-10.2B-Coder)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Gemma-10.2B-Coder\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T14:06:19.977620](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Gemma-10.2B-Coder/blob/main/results_2024-03-21T14-06-19.977620.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6193276450638524,\n\
\ \"acc_stderr\": 0.032595355839530035,\n \"acc_norm\": 0.6224484569319588,\n\
\ \"acc_norm_stderr\": 0.03325145393228843,\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.016629087514276775,\n \"mc2\": 0.5243704066589732,\n\
\ \"mc2_stderr\": 0.015065637058082886\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5597269624573379,\n \"acc_stderr\": 0.014506769524804246,\n\
\ \"acc_norm\": 0.5870307167235495,\n \"acc_norm_stderr\": 0.014388344935398326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.617307309300936,\n\
\ \"acc_stderr\": 0.004850508945116089,\n \"acc_norm\": 0.8203545110535749,\n\
\ \"acc_norm_stderr\": 0.0038310732859630774\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319878,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319878\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566016,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \
\ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817729,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817729\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n\
\ \"acc_stderr\": 0.04858083574266346,\n \"acc_norm\": 0.39215686274509803,\n\
\ \"acc_norm_stderr\": 0.04858083574266346\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n \"\
acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.04043461861916747,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.04043461861916747\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764812,\n \"\
acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764812\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.0352439084451178,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.0352439084451178\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694834,\n\
\ \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694834\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.03095663632856655,\n \
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.03095663632856655\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217905,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217905\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547129,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547129\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.04738975119274155,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.04738975119274155\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899136,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.02599247202930639,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.02599247202930639\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3743016759776536,\n\
\ \"acc_stderr\": 0.016185444179457175,\n \"acc_norm\": 0.3743016759776536,\n\
\ \"acc_norm_stderr\": 0.016185444179457175\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145894,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n\
\ \"acc_stderr\": 0.012697046024399687,\n \"acc_norm\": 0.44654498044328556,\n\
\ \"acc_norm_stderr\": 0.012697046024399687\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555026,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555026\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712845,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712845\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.037752516806863715,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.037752516806863715\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368032,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368032\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.016629087514276775,\n \"mc2\": 0.5243704066589732,\n\
\ \"mc2_stderr\": 0.015065637058082886\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.011570614861409357\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5011372251705838,\n \
\ \"acc_stderr\": 0.013772449096346838\n }\n}\n```"
repo_url: https://huggingface.co/FelixChao/Gemma-10.2B-Coder
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-06-19.977620.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-06-19.977620.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- '**/details_harness|winogrande|5_2024-03-21T14-06-19.977620.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T14-06-19.977620.parquet'
- config_name: results
data_files:
- split: 2024_03_21T14_06_19.977620
path:
- results_2024-03-21T14-06-19.977620.parquet
- split: latest
path:
- results_2024-03-21T14-06-19.977620.parquet
---
# Dataset Card for Evaluation run of FelixChao/Gemma-10.2B-Coder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/Gemma-10.2B-Coder](https://huggingface.co/FelixChao/Gemma-10.2B-Coder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__Gemma-10.2B-Coder",
    "harness_winogrande_5",
    split="latest")
```
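The aggregated metrics live in the "results" configuration; a minimal sketch for loading them (config and split names follow the listing above) could look like this:

```python
from datasets import load_dataset

# Aggregated metrics for this model; the "latest" split always
# points to the newest evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__Gemma-10.2B-Coder",
    "results",
    split="latest",
)
```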
## Latest results
These are the [latest results from run 2024-03-21T14:06:19.977620](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Gemma-10.2B-Coder/blob/main/results_2024-03-21T14-06-19.977620.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6193276450638524,
"acc_stderr": 0.032595355839530035,
"acc_norm": 0.6224484569319588,
"acc_norm_stderr": 0.03325145393228843,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.016629087514276775,
"mc2": 0.5243704066589732,
"mc2_stderr": 0.015065637058082886
},
"harness|arc:challenge|25": {
"acc": 0.5597269624573379,
"acc_stderr": 0.014506769524804246,
"acc_norm": 0.5870307167235495,
"acc_norm_stderr": 0.014388344935398326
},
"harness|hellaswag|10": {
"acc": 0.617307309300936,
"acc_stderr": 0.004850508945116089,
"acc_norm": 0.8203545110535749,
"acc_norm_stderr": 0.0038310732859630774
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952929,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952929
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319878,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319878
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566016,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817729,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817729
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.04043461861916747,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.04043461861916747
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764812,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764812
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.0352439084451178,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.0352439084451178
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.025088301454694834,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.025088301454694834
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217905,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217905
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547129,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547129
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.04738975119274155,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.04738975119274155
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899136,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.02599247202930639,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.02599247202930639
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3743016759776536,
"acc_stderr": 0.016185444179457175,
"acc_norm": 0.3743016759776536,
"acc_norm_stderr": 0.016185444179457175
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145894,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399687,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399687
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555026,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712845,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712845
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.037752516806863715,
"acc_norm": 0.83,
"acc_norm_stderr": 0.037752516806863715
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.03878626771002361,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.03878626771002361
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368032,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368032
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.016629087514276775,
"mc2": 0.5243704066589732,
"mc2_stderr": 0.015065637058082886
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.011570614861409357
},
"harness|gsm8k|5": {
"acc": 0.5011372251705838,
"acc_stderr": 0.013772449096346838
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-eval-multi_news-default-e22c67-2252871792 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- multi_news
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-xl-16384-book-summary
metrics: []
dataset_name: multi_news
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-xl-16384-book-summary
* Dataset: multi_news
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
Falah/arabic_modern_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 44164932
num_examples: 100000
download_size: 4499641
dataset_size: 44164932
---
# Dataset Card for "arabic_modern_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
leomiranda02/vozcaiomartins | ---
license: openrail
---
|
joey234/mmlu-human_aging-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 6203
num_examples: 5
- name: test
num_bytes: 1393366
num_examples: 223
download_size: 173053
dataset_size: 1399569
---
# Dataset Card for "mmlu-human_aging-neg-prepend-verbal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_arc_tr_dynamic | ---
dataset_info:
features:
- name: keys
dtype: string
- name: values
sequence: string
splits:
- name: train
num_bytes: 126841
num_examples: 250
download_size: 12407
dataset_size: 126841
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Biomedical-TeMU/SPACCC_Tokenizer | ---
license: cc-by-4.0
---
# The Tokenizer for Clinical Cases Written in Spanish
## Introduction
This repository contains the tokenization model trained using the SPACCC_TOKEN corpus (https://github.com/PlanTL-SANIDAD/SPACCC_TOKEN). The model was trained using 90% of the corpus (900 clinical cases) and tested against the remaining 10% (100 clinical cases). This model is a great resource for tokenizing biomedical documents, especially clinical cases written in Spanish.
This model was created using the Apache OpenNLP machine learning toolkit (https://opennlp.apache.org/), with the release number 1.8.4, released in December 2017.
This repository also contains the training set, the test set, and the gold standard used to build and evaluate the model.
## Prerequisites
This software has been compiled with Java SE 1.8 and it should work with recent versions. You can download Java from the following website: https://www.java.com/en/download
The executable file already includes the Apache OpenNLP dependencies inside, so the download of this toolkit is not necessary. However, you may download the latest version from this website: https://opennlp.apache.org/download.html
The library file we have used to compile is "opennlp-tools-1.8.4.jar". The source code should be able to compile with the latest version of OpenNLP, "opennlp-tools-*RELEASE_NUMBER*.jar". In case there are compilation or execution errors, please let us know and we will make all the necessary updates.
## Directory structure
<pre>
exec/
An executable file that can be used to apply the tokenization to your documents.
You can find the notes about its execution below in section "Usage".
gold_standard/
The clinical cases used as gold standard to evaluate the model's performance.
model/
The tokenization model, "es-tokenization-model-spaccc.bin", a binary file.
src/
The source code to create the model (CreateModelTok.java) and evaluate it (EvaluateModelTok.java).
The directory includes an example about how to use the model inside your code (Tokenization.java).
File "abbreviations.dat" contains a list of abbreviations, essential to build the model.
test_set/
The clinical cases used as test set to evaluate the model's performance.
train_set/
The clinical cases used to build the model. We use a single file with all documents present in
directory "train_set_docs" concatenated.
train_set_docs/
The clinical cases used to build the model. For each record, the sentences are already split.
</pre>
## Usage
The executable file *Tokenizer.jar* is the program you need to tokenize the text in your document. For this program, two arguments are needed: (1) the text file to tokenize, and (2) the model file (*es-tokenization-model-spaccc.bin*). The program will display all tokens in the terminal, with one token per line.
From the `exec` folder, type the following command in your terminal:
<pre>
$ java -jar Tokenizer.jar INPUT_FILE MODEL_FILE
</pre>
## Examples
Assuming you have the executable file, the input file and the model file in the same directory:
<pre>
$ java -jar Tokenizer.jar file.txt es-tokenization-model-spaccc.bin
</pre>
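For scripted use, the same jar can be driven from Python's standard library. A minimal sketch — the `tokenize` wrapper below is hypothetical (not part of this repository), and the file names follow the example above:

```python
import subprocess

# Hypothetical wrapper (not part of the repository): invoke the Tokenizer jar
# from Python and collect the tokens it prints, one per line.
def tokenize(input_file, model_file="es-tokenization-model-spaccc.bin",
             jar="Tokenizer.jar"):
    result = subprocess.run(
        ["java", "-jar", jar, input_file, model_file],
        capture_output=True, text=True, check=True)
    # The program writes one token per line to stdout.
    return result.stdout.splitlines()
```

This assumes `java`, the jar, and the model file are reachable from the working directory; `check=True` raises if the jar exits with an error.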
## Model creation
To create this tokenization model, we used the following training parameters (class *TrainingParameters* in OpenNLP) to get the best performance:
- Number of iterations: 1500.
- Cutoff parameter: 4.
- Trainer type parameter: *EventTrainer.EVENT_VALUE*.
- Algorithm: Maximum Entropy (*ModelType.MAXENT.name()*).
Meanwhile, we used the following parameters for the tokenizer builder (class *TokenizerFactory* in OpenNLP) to get the best performance:
- Language code: *es* (for Spanish).
- Abbreviation dictionary: file "abbreviations.dat" (included in the `src/` directory).
- Use alphanumeric optimization: false
- Alphanumeric pattern: null
## Model evaluation
After tuning the model with different values for each parameter, the configuration listed above gave the best performance on the test set.
| | Value |
| ----------------------------------------: | :------ |
| Number of tokens in the gold standard | 38247 |
| Number of tokens generated | 38227 |
| Number of words correctly tokenized | 38182 |
| Number of words wrongly tokenized | 35 |
| Number of tokens missed | 30 |
| **Precision** | **99.88%** |
| **Recall** | **99.83%** |
| **F-Measure** | **99.85%**|
Table 1: Evaluation statistics for the tokenization model.
## Contact
Ander Intxaurrondo (ander.intxaurrondo@bsc.es)
## License
<a rel="license" href="http://creativecommons.org/licenses/by/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by/4.0/88x31.png" /></a><br />This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>.
Copyright (c) 2018 Secretaría de Estado para el Avance Digital (SEAD)
|
mHossain/final_train_v2_450000 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 9110645.1
num_examples: 27000
- name: test
num_bytes: 1012293.9
num_examples: 3000
download_size: 4438870
dataset_size: 10122939.0
---
# Dataset Card for "final_train_v2_450000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
judgeou/nai3-artist-collection | ---
license: cc-by-sa-4.0
---
|
manojpatil/training | ---
language:
- en
size_categories:
- 1K<n<10K
task_categories:
- summarization
tags:
- legal
pretty_name: abc
--- |
CyberHarem/furutaka_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of furutaka/古鷹/古鹰 (Azur Lane)
This is the dataset of furutaka/古鷹/古鹰 (Azur Lane), containing 34 images and their tags.
The core tags of this character are `long_hair, brown_hair, breasts, blue_eyes, large_breasts, animal_ears, bangs, between_breasts, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 36.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furutaka_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 23.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furutaka_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 90 | 52.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furutaka_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 33.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furutaka_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 90 | 70.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furutaka_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/furutaka_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 34 |  |  |  |  |  | 1girl, retrofit_(azur_lane), solo, detached_sleeves, looking_at_viewer, pleated_skirt, sailor_collar, blush, navel, midriff, thighhighs, crop_top, simple_background, black_skirt, miniskirt, open_mouth, gloves, white_background, wide_sleeves, smile, armpits |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | retrofit_(azur_lane) | solo | detached_sleeves | looking_at_viewer | pleated_skirt | sailor_collar | blush | navel | midriff | thighhighs | crop_top | simple_background | black_skirt | miniskirt | open_mouth | gloves | white_background | wide_sleeves | smile | armpits |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------------|:-------|:-------------------|:--------------------|:----------------|:----------------|:--------|:--------|:----------|:-------------|:-----------|:--------------------|:--------------|:------------|:-------------|:---------|:-------------------|:---------------|:--------|:----------|
| 0 | 34 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
AdapterOcean/biology_dataset_standardized_cluster_3_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 22853614
num_examples: 7463
download_size: 0
dataset_size: 22853614
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "biology_dataset_standardized_cluster_3_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/textvqa_valid_google_flan_t5_xxl_mode_OCR_VQA_Q_rices_ns_5000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 934914
num_examples: 5000
download_size: 334889
dataset_size: 934914
configs:
- config_name: default
data_files:
- split: fewshot_0
path: data/fewshot_0-*
---
|
benayas/snips_chatgpt_20pct_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1053502
num_examples: 13084
download_size: 423959
dataset_size: 1053502
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TheFinAI/en-acronym | ---
dataset_info:
features:
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: test
num_bytes: 142136
num_examples: 1527
download_size: 48264
dataset_size: 142136
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
unknownX/Eddie_cartoon | ---
license: other
---
|
ProfQu/MinecraftRecipes | ---
license: mit
---
# MinecraftRecipes
This dataset contains all crafting table recipes from Minecraft; the recipes were extracted directly from the Minecraft source.
The data was generated from [this repository](https://github.com/ProfessorQu/MCCraftingRecipes). |
CyberHarem/nanao_yuriko_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nanao_yuriko/七尾百合子/나나오유리코 (THE iDOLM@STER: Million Live!)
This is the dataset of nanao_yuriko/七尾百合子/나나오유리코 (THE iDOLM@STER: Million Live!), containing 500 images and their tags.
The core tags of this character are `blue_hair, yellow_eyes, short_hair, breasts, braid, bangs, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 568.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nanao_yuriko_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 356.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nanao_yuriko_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1212 | 759.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nanao_yuriko_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 514.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nanao_yuriko_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1212 | 1.00 GiB | [Download](https://huggingface.co/datasets/CyberHarem/nanao_yuriko_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nanao_yuriko_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, solo, smile, looking_at_viewer, blush, open_mouth, book |
| 1 | 26 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, open_mouth, frilled_bikini, navel, hair_flower, smile, cleavage, outdoors, collarbone, green_bikini |
| 2 | 18 |  |  |  |  |  | 1boy, 1girl, blush, hetero, open_mouth, solo_focus, nipples, pussy, sex, vaginal, penis, lying, navel, completely_nude, mosaic_censoring, spread_legs |
| 3 | 5 |  |  |  |  |  | 1boy, 1girl, blush, fellatio, hetero, nude, penis, solo_focus, looking_at_viewer, pov, ass, mosaic_censoring, cum_in_mouth |
| 4 | 9 |  |  |  |  |  | 1girl, serafuku, solo, looking_at_viewer, white_shirt, blush, navel, pleated_skirt, red_neckerchief, short_sleeves, fingerless_gloves, white_background, white_skirt, black_gloves, cape, closed_mouth, midriff, simple_background, black_thighhighs, medium_hair, white_sailor_collar |
| 5 | 7 |  |  |  |  |  | 1girl, blush, collarbone, solo, cleavage, upper_body, looking_at_viewer, simple_background, white_background, armpits, arms_up, bow_bra, navel, small_breasts, smile, underwear_only |
| 6 | 8 |  |  |  |  |  | 1girl, blush, looking_at_viewer, open_mouth, solo, :d, bow, detached_sleeves, white_dress, bare_shoulders, collarbone, frilled_dress, hair_ribbon, puffy_short_sleeves, sleeveless_dress, yellow_ribbon, blue_sky, day, mini_hat, outdoors, sailor_dress, white_headwear, wrist_cuffs, argyle, choker, cloud, ribbon_braid, sparkle, standing, tilted_headwear, white_sailor_collar, belt_buckle, blue_thighhighs, feathers, holding, outstretched_arm, pleated_dress, shiny_hair, white_background, white_sleeves |
| 7 | 6 |  |  |  |  |  | 1girl, detached_collar, looking_at_viewer, rabbit_ears, solo, wrist_cuffs, cleavage, fake_animal_ears, playboy_bunny, bare_shoulders, black_bowtie, blush, sitting, strapless_leotard, black_pantyhose, closed_mouth, covered_navel, cowboy_shot, holding_tray, rabbit_tail, smile, wine_glass |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | looking_at_viewer | blush | open_mouth | book | frilled_bikini | navel | hair_flower | cleavage | outdoors | collarbone | green_bikini | 1boy | hetero | solo_focus | nipples | pussy | sex | vaginal | penis | lying | completely_nude | mosaic_censoring | spread_legs | fellatio | nude | pov | ass | cum_in_mouth | serafuku | white_shirt | pleated_skirt | red_neckerchief | short_sleeves | fingerless_gloves | white_background | white_skirt | black_gloves | cape | closed_mouth | midriff | simple_background | black_thighhighs | medium_hair | white_sailor_collar | upper_body | armpits | arms_up | bow_bra | small_breasts | underwear_only | :d | bow | detached_sleeves | white_dress | bare_shoulders | frilled_dress | hair_ribbon | puffy_short_sleeves | sleeveless_dress | yellow_ribbon | blue_sky | day | mini_hat | sailor_dress | white_headwear | wrist_cuffs | argyle | choker | cloud | ribbon_braid | sparkle | standing | tilted_headwear | belt_buckle | blue_thighhighs | feathers | holding | outstretched_arm | pleated_dress | shiny_hair | white_sleeves | detached_collar | rabbit_ears | fake_animal_ears | playboy_bunny | black_bowtie | sitting | strapless_leotard | black_pantyhose | covered_navel | cowboy_shot | holding_tray | rabbit_tail | wine_glass |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------|:-------------|:-------|:-----------------|:--------|:--------------|:-----------|:-----------|:-------------|:---------------|:-------|:---------|:-------------|:----------|:--------|:------|:----------|:--------|:--------|:------------------|:-------------------|:--------------|:-----------|:-------|:------|:------|:---------------|:-----------|:--------------|:----------------|:------------------|:----------------|:--------------------|:-------------------|:--------------|:---------------|:-------|:---------------|:----------|:--------------------|:-------------------|:--------------|:----------------------|:-------------|:----------|:----------|:----------|:----------------|:-----------------|:-----|:------|:-------------------|:--------------|:-----------------|:----------------|:--------------|:----------------------|:-------------------|:----------------|:-----------|:------|:-----------|:---------------|:-----------------|:--------------|:---------|:---------|:--------|:---------------|:----------|:-----------|:------------------|:--------------|:------------------|:-----------|:----------|:-------------------|:----------------|:-------------|:----------------|:------------------|:--------------|:-------------------|:----------------|:---------------|:----------|:--------------------|:------------------|:----------------|:--------------|:---------------|:--------------|:-------------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 26 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 18 |  |  |  |  |  | X | | | | X | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | X | | | | | | | | | | X | X | X | | | | | X | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | X | X | X | X | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | X | | X | X | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | X | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Lo/adapt-pre-trained-VL-models-to-text-data-LXMERT-finetune | ---
language:
- en
license:
- mit
multilinguality:
- monolingual
---
The LXMERT text fine-tuning data used to train visual features for the adaptation of vision-and-language models to text-only tasks in the paper "How to Adapt Pre-trained Vision-and-Language Models to a Text-only Input?".
The data has been created from the data made available by the [LXMERT repo](https://github.com/airsplay/lxmert).
|
RiniPL/Dementia_Dataset | ---
license: ecl-2.0
task_categories:
- image-classification
language:
- en
tags:
- code
pretty_name: Dementia
--- |
rbeauchamp/augmented_images_perplexity | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: image_path
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 1000689543.812
num_examples: 1052
download_size: 1001267552
dataset_size: 1000689543.812
---
# Dataset Card for "augmented_images_perplexity"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bezzam/DigiCam-Mirflickr-MultiMask-10K | ---
license: mit
dataset_info:
features:
- name: lensless
dtype: image
- name: lensed
dtype: image
- name: mask_label
dtype: int64
splits:
- name: train
num_bytes: 4048333123.5
num_examples: 8500
- name: test
num_bytes: 716310780.5
num_examples: 1500
download_size: 4765086529
dataset_size: 4764643904.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Denissilva88/NERI | ---
license: openrail
---
|
serbog/esco_occupations_details_multilingual | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: el
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: lt
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: code
dtype: string
- name: uk
struct:
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: ga
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: sv
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: cs
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: bg
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: 'no'
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: en
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: lv
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: ar
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: es
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: et
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: fi
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: sk
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: da
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: nl
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: is
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: sl
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: hr
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: pl
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: it
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: de
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: url
dtype: string
- name: mt
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: hu
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: fr
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: pt
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
- name: ro
struct:
- name: alternativeLabel
sequence: string
- name: description
dtype: string
- name: preferredLabel
dtype: string
- name: preferredTerm
dtype: string
splits:
- name: train
num_bytes: 52470213
num_examples: 3629
download_size: 22696020
dataset_size: 52470213
---
# Dataset Card for "esco_occupations_details_multilingual"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/train_free_13 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604588872
num_examples: 10000
download_size: 1312729438
dataset_size: 9604588872
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
andersonbcdefg/micropile | ---
dataset_info:
features:
- name: text
dtype: string
- name: __id
dtype: int64
splits:
- name: train
num_bytes: 5544284
num_examples: 1000
download_size: 2933209
dataset_size: 5544284
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "micropile"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmrau/cqadubstack-gaming-qrels | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 60520
num_examples: 2263
download_size: 32524
dataset_size: 60520
---
# Dataset Card for "cqadubstack-gaming-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MouhsineGT/new_fr_200_excel | ---
license: unknown
---
|