| datasetId | card |
|---|---|
CyberHarem/kurosawa_dia_lovelivesunshine | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kurosawa_dia/黒澤ダイヤ/쿠로사와다이아 (Love Live! Sunshine!!)
This is the dataset of kurosawa_dia/黒澤ダイヤ/쿠로사와다이아 (Love Live! Sunshine!!), containing 500 images and their tags.
The core tags of this character are `bangs, black_hair, mole, mole_under_mouth, long_hair, hair_ornament, blunt_bangs, green_eyes, hairclip, sidelocks`, which are pruned from the per-image tag lists in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 700.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurosawa_dia_lovelivesunshine/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 500      | 377.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurosawa_dia_lovelivesunshine/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800  | 1181     | 811.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurosawa_dia_lovelivesunshine/resolve/main/dataset-stage3-p480-800.zip)    | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
| 1200             | 500      | 608.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kurosawa_dia_lovelivesunshine/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1181     | 1.16 GiB   | [Download](https://huggingface.co/datasets/CyberHarem/kurosawa_dia_lovelivesunshine/resolve/main/dataset-stage3-p480-1200.zip)   | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.   |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kurosawa_dia_lovelivesunshine',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
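Once the items are loaded, the per-image tag lists can be aggregated, for example to find the most frequent tags across the dataset. A minimal sketch in plain Python — the tag lists below are illustrative placeholders, not taken from the archive, and the exact shape of `item.meta['tags']` should be checked against what the loop above prints:

```python
from collections import Counter

# illustrative per-image tag lists, as they might appear in item.meta['tags']
image_tags = [
    ['1girl', 'solo', 'smile', 'serafuku'],
    ['1girl', 'solo', 'kimono', 'obi'],
    ['1girl', 'smile', 'serafuku', 'open_mouth'],
]

# count how often each tag occurs across images
tag_counts = Counter(tag for tags in image_tags for tag in tags)
print(tag_counts.most_common(3))  # most frequent tags first
```

The same counting works over the real dataset by collecting each item's tags inside the loading loop.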
## List of Clusters
List of tag clustering results; some recurring outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, looking_at_viewer, pencil_skirt, long_sleeves, office_lady, skirt_suit, smile, sitting, black_jacket, black_skirt, collared_shirt, white_shirt, dress_shirt |
| 1 | 7 |  |  |  |  |  | 1girl, blush, floral_print, long_sleeves, looking_at_viewer, obi, red_kimono, solo, wide_sleeves, smile, hair_flower, folding_fan, holding_fan, open_mouth, red_flower, upper_body, dated, new_year |
| 2 | 9 |  |  |  |  |  | 1girl, blush, floral_print, hair_flower, kimono, looking_at_viewer, obi, solo, single_hair_bun, smile, upper_body, long_sleeves, hair_up, holding, wide_sleeves, alternate_hairstyle, aqua_eyes, blurry |
| 3 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, solo, obi, floral_print, blush, hair_stick, smile, earrings, short_kimono, collarbone, frilled_sleeves, hair_tubes, aqua_eyes, single_hair_bun, arm_warmers, short_sleeves |
| 4 | 30 |  |  |  |  |  | 1girl, looking_at_viewer, serafuku, solo, uranohoshi_school_uniform, neckerchief, pleated_skirt, grey_skirt, smile, blush, long_sleeves, shirt, open_mouth, short_sleeves, tie_clip, grey_sailor_collar |
| 5 | 5 |  |  |  |  |  | 1girl, serafuku, solo, upper_body, uranohoshi_school_uniform, green_neckerchief, looking_at_viewer, simple_background, white_background, alternate_hair_length, grey_sailor_collar, short_sleeves, :o, hand_up, long_sleeves, nose_blush, open_mouth, parted_lips, short_hair, tie_clip, white_shirt |
| 6 | 10 |  |  |  |  |  | 1girl, bare_shoulders, collarbone, looking_at_viewer, smile, solo, blush, white_background, bow, hair_flower, strapless_dress, closed_mouth, red_dress, simple_background, bare_arms, brown_hair, small_breasts, upper_body |
| 7 | 27 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, earrings, aqua_eyes, skirt, bracelet, hair_flower, open_mouth, :d, detached_sleeves, dress |
| 8 | 7 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, smile, solo, blush, pink_bow, red_skirt, hair_bow, bag, day, outdoors, shirt, single_braid, blue_sky, building, cloud, white_jacket |
| 9 | 5 |  |  |  |  |  | 1girl, holding_sword, katana, kimono, looking_at_viewer, solo, hair_bow, long_sleeves, red_bow, smile, upper_body, closed_mouth, floral_print, wide_sleeves, aqua_eyes, brown_hair, full_moon, multicolored_hair, night, simple_background, skirt, unsheathing |
| 10 | 11 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, aqua_eyes, cloud, day, bracelet, outdoors, blue_sky, red_bikini, smile, cherry_blossom_print, collarbone, medium_breasts, navel, open_mouth, parted_lips, skirt, thigh_strap |
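Each cluster row above is characterized by a tag set, so images carrying every core tag of a cluster can be treated as candidates for that outfit. A hedged sketch: the cluster tags are abbreviated from row 4 (the school-uniform cluster), while the image-to-tags mapping is invented for illustration:

```python
# abbreviated core tags of cluster 4 (the school-uniform outfit)
cluster_tags = {'1girl', 'serafuku', 'uranohoshi_school_uniform', 'pleated_skirt'}

# hypothetical image -> tag-set mapping
images = {
    'a.png': {'1girl', 'serafuku', 'uranohoshi_school_uniform',
              'pleated_skirt', 'smile'},
    'b.png': {'1girl', 'kimono', 'obi'},
}

# an image matches when it carries every core tag of the cluster
matches = [name for name, tags in images.items() if cluster_tags <= tags]
print(matches)  # ['a.png']
```

Set inclusion (`<=`) keeps the match strict; loosening it to an overlap threshold would surface partial matches as well.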
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | pencil_skirt | long_sleeves | office_lady | skirt_suit | smile | sitting | black_jacket | black_skirt | collared_shirt | white_shirt | dress_shirt | blush | floral_print | obi | red_kimono | wide_sleeves | hair_flower | folding_fan | holding_fan | open_mouth | red_flower | upper_body | dated | new_year | kimono | single_hair_bun | hair_up | holding | alternate_hairstyle | aqua_eyes | blurry | hair_stick | earrings | short_kimono | collarbone | frilled_sleeves | hair_tubes | arm_warmers | short_sleeves | serafuku | uranohoshi_school_uniform | neckerchief | pleated_skirt | grey_skirt | shirt | tie_clip | grey_sailor_collar | green_neckerchief | simple_background | white_background | alternate_hair_length | :o | hand_up | nose_blush | parted_lips | short_hair | bare_shoulders | bow | strapless_dress | closed_mouth | red_dress | bare_arms | brown_hair | small_breasts | skirt | bracelet | :d | detached_sleeves | dress | pink_bow | red_skirt | hair_bow | bag | day | outdoors | single_braid | blue_sky | building | cloud | white_jacket | holding_sword | katana | red_bow | full_moon | multicolored_hair | night | unsheathing | red_bikini | cherry_blossom_print | medium_breasts | navel | thigh_strap |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:--------------------|:---------------|:---------------|:--------------|:-------------|:--------|:----------|:---------------|:--------------|:-----------------|:--------------|:--------------|:--------|:---------------|:------|:-------------|:---------------|:--------------|:--------------|:--------------|:-------------|:-------------|:-------------|:--------|:-----------|:---------|:------------------|:----------|:----------|:----------------------|:------------|:---------|:-------------|:-----------|:---------------|:-------------|:------------------|:-------------|:--------------|:----------------|:-----------|:----------------------------|:--------------|:----------------|:-------------|:--------|:-----------|:---------------------|:--------------------|:--------------------|:-------------------|:------------------------|:-----|:----------|:-------------|:--------------|:-------------|:-----------------|:------|:------------------|:---------------|:------------|:------------|:-------------|:----------------|:--------|:-----------|:-----|:-------------------|:--------|:-----------|:------------|:-----------|:------|:------|:-----------|:---------------|:-----------|:-----------|:--------|:---------------|:----------------|:---------|:----------|:------------|:--------------------|:--------|:--------------|:-------------|:-----------------------|:-----------------|:--------|:--------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | X | X | X | | X | X | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | X | X | X | | | | | X | | | | | | | X | X | X | | | | | | | | | | | | X | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 30 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | X | | | | | | | | X | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | X | X | | | | | X | | | | | | | X | | | | | X | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 27 |  |  |  |  |  | X | X | X | | | | | | | | | | | | X | | | | | X | | | X | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | | X | | | X | | | | | | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | X | | X | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | | | | | |
| 10 | 11 |  |  |  |  |  | X | X | X | | | | | X | | | | | | | X | | | | | | | | X | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | | | | | | | | X | X | | X | | X | | | | | | | | | X | X | X | X | X |
|
stulcrad/CNEC2_0_flat | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-ah
'2': I-ah
'3': B-at
'4': I-at
'5': B-az
'6': I-az
'7': B-g_
'8': I-g_
'9': B-gc
'10': I-gc
'11': B-gh
'12': I-gh
'13': B-gl
'14': I-gl
'15': B-gq
'16': I-gq
'17': B-gr
'18': I-gr
'19': B-gs
'20': I-gs
'21': B-gt
'22': I-gt
'23': B-gu
'24': I-gu
'25': B-i_
'26': I-i_
'27': B-ia
'28': I-ia
'29': B-ic
'30': I-ic
'31': B-if
'32': I-if
'33': B-io
'34': I-io
'35': B-me
'36': I-me
'37': B-mi
'38': I-mi
'39': B-mn
'40': I-mn
'41': B-ms
'42': I-ms
'43': B-n_
'44': I-n_
'45': B-na
'46': I-na
'47': B-nb
'48': I-nb
'49': B-nc
'50': I-nc
'51': B-ni
'52': I-ni
'53': B-no
'54': I-no
'55': B-ns
'56': I-ns
'57': B-o_
'58': I-o_
'59': B-oa
'60': I-oa
'61': B-oe
'62': I-oe
'63': B-om
'64': I-om
'65': B-op
'66': I-op
'67': B-or
'68': I-or
'69': B-p_
'70': I-p_
'71': B-pc
'72': I-pc
'73': B-pd
'74': I-pd
'75': B-pf
'76': I-pf
'77': B-pm
'78': I-pm
'79': B-pp
'80': I-pp
'81': B-ps
'82': I-ps
'83': B-td
'84': I-td
'85': B-tf
'86': I-tf
'87': B-th
'88': I-th
'89': B-tm
'90': I-tm
'91': B-ty
'92': I-ty
- name: langs
sequence: string
- name: spans
sequence: string
splits:
- name: train
num_bytes: 4456060
num_examples: 7193
- name: validation
num_bytes: 557383
num_examples: 900
- name: test
num_bytes: 560798
num_examples: 899
download_size: 1262119
dataset_size: 5574241
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
language:
- cs
--- |
Hung2003vn/dataset_quy_trinh_02 | ---
license: apache-2.0
---
|
KADUZADA/ROBSON | ---
license: openrail
---
|
open-llm-leaderboard/details_DevaMalla__llama7b_alpaca_1gpu_bf16 | ---
pretty_name: Evaluation run of DevaMalla/llama7b_alpaca_1gpu_bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DevaMalla/llama7b_alpaca_1gpu_bf16](https://huggingface.co/DevaMalla/llama7b_alpaca_1gpu_bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DevaMalla__llama7b_alpaca_1gpu_bf16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T19:12:43.591468](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama7b_alpaca_1gpu_bf16/blob/main/results_2023-09-22T19-12-43.591468.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.00037786091964608015,\n \"f1\": 0.05976300335570471,\n\
\ \"f1_stderr\": 0.0013045567269002268,\n \"acc\": 0.38738538738957606,\n\
\ \"acc_stderr\": 0.009113781208674253\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964608015,\n\
\ \"f1\": 0.05976300335570471,\n \"f1_stderr\": 0.0013045567269002268\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.045489006823351025,\n \
\ \"acc_stderr\": 0.005739657656722204\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7292817679558011,\n \"acc_stderr\": 0.012487904760626303\n\
\ }\n}\n```"
repo_url: https://huggingface.co/DevaMalla/llama7b_alpaca_1gpu_bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|arc:challenge|25_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T19_12_43.591468
path:
- '**/details_harness|drop|3_2023-09-22T19-12-43.591468.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T19-12-43.591468.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T19_12_43.591468
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-12-43.591468.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-12-43.591468.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hellaswag|10_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T10:17:49.749998.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T10:17:49.749998.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T10:17:49.749998.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T19_12_43.591468
path:
- '**/details_harness|winogrande|5_2023-09-22T19-12-43.591468.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T19-12-43.591468.parquet'
- config_name: results
data_files:
- split: 2023_08_30T10_17_49.749998
path:
- results_2023-08-30T10:17:49.749998.parquet
- split: 2023_09_22T19_12_43.591468
path:
- results_2023-09-22T19-12-43.591468.parquet
- split: latest
path:
- results_2023-09-22T19-12-43.591468.parquet
---
# Dataset Card for Evaluation run of DevaMalla/llama7b_alpaca_1gpu_bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DevaMalla/llama7b_alpaca_1gpu_bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DevaMalla/llama7b_alpaca_1gpu_bf16](https://huggingface.co/DevaMalla/llama7b_alpaca_1gpu_bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama7b_alpaca_1gpu_bf16",
"harness_winogrande_5",
	split="latest")
```
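Each timestamped split name encodes the run's timestamp with `-` and `:` replaced by `_` (compare the split `2023_09_22T19_12_43.591468` with the file `results_2023-09-22T19-12-43.591468.parquet` in the YAML above). A minimal helper to map a split name back to a `datetime`, assuming exactly that naming convention, might look like:

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    # Split names replace '-' in the date and ':' in the time with '_',
    # e.g. "2023_09_22T19_12_43.591468" <-> "2023-09-22T19:12:43.591468".
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_name_to_datetime("2023_09_22T19_12_43.591468"))
```

This can be used, for instance, to sort the timestamped split names of a configuration and pick the most recent run by hand instead of relying on the `latest` alias.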
## Latest results
These are the [latest results from run 2023-09-22T19:12:43.591468](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama7b_alpaca_1gpu_bf16/blob/main/results_2023-09-22T19-12-43.591468.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964608015,
"f1": 0.05976300335570471,
"f1_stderr": 0.0013045567269002268,
"acc": 0.38738538738957606,
"acc_stderr": 0.009113781208674253
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964608015,
"f1": 0.05976300335570471,
"f1_stderr": 0.0013045567269002268
},
"harness|gsm8k|5": {
"acc": 0.045489006823351025,
"acc_stderr": 0.005739657656722204
},
"harness|winogrande|5": {
"acc": 0.7292817679558011,
"acc_stderr": 0.012487904760626303
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
davanstrien/fuego-20230322-211033-00ad7c | ---
tags:
- fuego
fuego:
id: 20230322-211033-00ad7c
status: running
script: script.py
requirements_file: requirements.txt
space_id: davanstrien/fuego-20230322-211033-00ad7c
space_hardware: cpu-basic
---
|
fairlabs/fairlabs-esg-sentiment-data-balance | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 10723634.409845466
num_examples: 53608
- name: validation
num_bytes: 2681108.64041111
num_examples: 13403
- name: test
num_bytes: 200037.94974342384
num_examples: 1000
download_size: 7417014
dataset_size: 13604781.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
katxtong/tokenized_squad_validation_size384 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: offset_mapping
sequence:
sequence: int64
- name: example_id
dtype: string
splits:
- name: validation
num_bytes: 65884992
num_examples: 10784
download_size: 6124969
dataset_size: 65884992
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
sandvenu/resume-dataset | ---
dataset_info:
features:
- name: ID
dtype: int64
- name: Resume_str
dtype: string
- name: Resume_html
dtype: string
- name: Category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 32750676
num_examples: 1490
- name: validation
num_bytes: 11125779
num_examples: 497
- name: test
num_bytes: 10943410
num_examples: 497
download_size: 20318976
dataset_size: 54819865
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Romildon/locutor | ---
license: openrail
---
|
autoevaluate/autoeval-eval-multi_nli-default-544a62-53715145359 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- multi_nli
eval_info:
task: natural_language_inference
model: roberta-large-mnli
metrics: []
dataset_name: multi_nli
dataset_config: default
dataset_split: validation_matched
col_mapping:
text1: premise
text2: hypothesis
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: roberta-large-mnli
* Dataset: multi_nli
* Config: default
* Split: validation_matched
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@kslnet](https://huggingface.co/kslnet) for evaluating this model. |
modelloosrvcc/Toad | ---
license: openrail
---
|
Columbia-NLP/ruozhiba_en | ---
dataset_info:
features:
- name: source
dtype: string
- name: instruction
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: followup_question
dtype: string
- name: model
dtype: string
splits:
- name: train_sft
num_bytes: 954797
num_examples: 238
download_size: 548182
dataset_size: 954797
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
size_categories:
- n<1K
---
# Ruozhiba English Data
Based on the findings from [COIG-CQIA](https://arxiv.org/html/2403.18058v1), Ruozhiba data is a high-quality instruction tuning dataset that can greatly improve supervised fine-tuning models' performance.
We translated the 240 instructions in Ruozhiba from Chinese to English.
We filtered out or modified some instructions that were language- or culture-specific.
Some Chinese instructions are kept to maintain their original meaning.
Finally, we re-generated the responses using `gpt-4-turbo` and added one additional turn to improve robustness.
## MT-Bench
We use GPT-4-0125-preview as Judge. On MT-Bench, the [ruozhiba_en](https://huggingface.co/datasets/qywu/ruozhiba_en) data achieves performance comparable to the [ultrachat_200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k) dataset.
| Model | Total | Coding | Extraction | Humanities | Math | Reasoning | Roleplay | STEM | Writing |
|--------------------------------------------|-------|--------|------------|------------|------|-----------|----------|------|---------|
| alignment-handbook/zephyr-7b-sft-full | 5.6 | 3.95 | 6.75 | 7.5 | 3.1 | 4.05 | 6.15 | 6.1 | 7.2 |
| zephyr-7b-sft-ruozhiba | 5.88 | 3.75 | 6.45 | 8.11 | 2.7 | 4.2 | 7.4 | 7.4 | 7.15 | |
Waflon/FAQ | ---
language:
- es
--- |
jahb57/bert_embeddings_BATCH_4 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: last_hidden_state
sequence:
sequence: float32
- name: pooler_output
sequence: float32
splits:
- name: train
num_bytes: 19673418221
num_examples: 100000
download_size: 19797895899
dataset_size: 19673418221
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ArtifactClfDurham/OrientalMuseum-3Dwhite | ---
dataset_info:
features:
- name: obj_num
dtype: string
- name: file
dtype: string
- name: image
dtype: image
- name: root
dtype: string
- name: description
dtype: string
- name: object_name
dtype: string
- name: other_name
dtype: string
- name: material
dtype: string
- name: production.period
dtype: string
- name: production.place
dtype: string
- name: new_root
dtype: string
- name: original
dtype: bool
splits:
- name: train
num_bytes: 4637401728.075
num_examples: 186525
download_size: 4895658097
dataset_size: 4637401728.075
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arkanbima/ns-en-id | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 11371299
num_examples: 38462
- name: validation
num_bytes: 498293
num_examples: 1953
- name: test
num_bytes: 491317
num_examples: 1954
download_size: 7886554
dataset_size: 12360909
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
atmallen/quirky_popqa_pythia-410m_bob_hard | ---
dataset_info:
features:
- name: id
dtype: string
- name: choices
sequence: string
- name: label
dtype: int64
- name: popularity
dtype: int64
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: character
dtype: string
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: bob_log_odds
dtype: float64
splits:
- name: train
num_bytes: 956816.9929078014
num_examples: 6134
- name: validation
num_bytes: 81519.174
num_examples: 522
- name: test
num_bytes: 77372.528
num_examples: 496
download_size: 396803
dataset_size: 1115708.6949078015
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
chirunder/MixAtis_for_DecoderOnly_90-10_split-HALF | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 6669000.899910009
num_examples: 9000
- name: test
num_bytes: 741741.100089991
num_examples: 1001
download_size: 1874389
dataset_size: 7410742.0
---
# Dataset Card for "MixAtis_for_DecoderOnly_90-10_split-HALF"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
omar07ibrahim/AZERBAIJAN-ENGLISH-DATASET | ---
license: cc-by-4.0
---
|
imvladikon/knesset_meetings_corpus | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
language:
- he
license:
- pddl
multilinguality:
- monolingual
size_categories:
- n<1K
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
pretty_name: Knesset Meetings Corpus
---
# Dataset Card
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://zenodo.org/record/2707356](https://zenodo.org/record/2707356)
- **Repository:** [https://github.com/NLPH/knesset-2004-2005](https://github.com/NLPH/knesset-2004-2005)
- **Paper:**
- **Point of Contact:**
- **Size of downloaded dataset files:**
- **Size of the generated dataset:**
- **Total amount of disk used:**
### Dataset Summary
An example of a sample:
```
{
"text": <text content of given document>,
"path": <file path to docx>
}
```
Dataset usage: the available configurations are `"kneset16"`, `"kneset17"`, and `"knesset_tagged"`, each with a single `train` split.
```python
from datasets import load_dataset

train_ds = load_dataset("imvladikon/knesset_meetings_corpus", "kneset16", split="train")
```
The Knesset Meetings Corpus 2004-2005 is made up of two components:
* Raw texts - 282 files made up of 867,725 lines together. These can be downloaded in two formats:
  * As `doc` files, encoded using `windows-1255` encoding:
    * `kneset16.zip` - Contains 164 text files made up of 543,228 lines together. [MILA host](http://yeda.cs.technion.ac.il:8088/corpus/software/corpora/knesset/txt/docs/kneset16.zip), [Github Mirror](https://github.com/NLPH/knesset-2004-2005/blob/master/kneset16.zip?raw=true)
    * `kneset17.zip` - Contains 118 text files made up of 324,497 lines together. [MILA host](http://yeda.cs.technion.ac.il:8088/corpus/software/corpora/knesset/txt/docs/kneset17.zip), [Github Mirror](https://github.com/NLPH/knesset-2004-2005/blob/master/kneset17.zip?raw=true)
  * As `txt` files, encoded using `utf8` encoding:
    * `kneset.tar.gz` - An archive of all the raw text files, divided into two folders: [Github mirror](https://github.com/NLPH/knesset-2004-2005/blob/master/kneset.tar.gz)
      * `16` - Contains 164 text files made up of 543,228 lines together.
      * `17` - Contains 118 text files made up of 324,497 lines together.
    * `knesset_txt_16.tar.gz` - Contains 164 text files made up of 543,228 lines together. [MILA host](http://yeda.cs.technion.ac.il:8088/corpus/software/corpora/knesset/txt/utf8/knesset_txt_16.tar.gz), [Github Mirror](https://github.com/NLPH/knesset-2004-2005/blob/master/knesset_txt_16.tar.gz?raw=true)
    * `knesset_txt_17.zip` - Contains 118 text files made up of 324,497 lines together. [MILA host](http://yeda.cs.technion.ac.il:8088/corpus/software/corpora/knesset/txt/utf8/knesset_txt_17.zip), [Github Mirror](https://github.com/NLPH/knesset-2004-2005/blob/master/knesset_txt_17.zip?raw=true)
* Tokenized and morphologically tagged texts - Tagged versions exist only for the files in the `16` folder. The texts are encoded using [MILA's XML schema for corpora](http://www.mila.cs.technion.ac.il/eng/resources_standards.html). These can be downloaded in two ways:
  * `knesset_tagged_16.tar.gz` - An archive of all tokenized and tagged files. [MILA host](http://yeda.cs.technion.ac.il:8088/corpus/software/corpora/knesset/tagged/knesset_tagged_16.tar.gz), [Archive.org mirror](https://archive.org/details/knesset_transcripts_2004_2005)
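Since the `doc`-format archives use `windows-1255` while the `txt` archives use `utf8`, re-encoding may be needed when working with the raw files. A minimal sketch (the file paths are illustrative, not part of the corpus layout):

```python
from pathlib import Path

def reencode_to_utf8(src: Path, dst: Path) -> None:
    # Read a windows-1255 encoded raw corpus file and write it back as UTF-8.
    text = src.read_text(encoding="windows-1255")
    dst.write_text(text, encoding="utf-8")

# Illustrative usage; the actual archives must be downloaded and unpacked first.
# reencode_to_utf8(Path("kneset16/protocol_001.txt"), Path("kneset16_utf8/protocol_001.txt"))
```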
#### Mirrors
This repository is a mirror of this dataset [found on MILA's website](http://www.mila.cs.technion.ac.il/eng/resources_corpora_haknesset.html).
Zenodo mirror: [https://zenodo.org/record/2707356](https://zenodo.org/record/2707356)
#### License
All Knesset meeting protocols are in the [public domain](https://en.wikipedia.org/wiki/Public_domain) ([רשות הציבור](https://he.wikipedia.org/wiki/%D7%A8%D7%A9%D7%95%D7%AA_%D7%94%D7%A6%D7%99%D7%91%D7%95%D7%A8)) by law. These files are thus in the public domain and do not require any license or public domain dedication to set their status.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
The dataset is available under the [Open Data Commons Public Domain Dedication & License 1.0](https://opendatacommons.org/licenses/pddl/).
### Citation Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Contributions
|
dary/agagga_oaoa | ---
license: openrail
---
|
BarrenWardo/SDControlNets | ---
license: unknown
---
|
erhwenkuo/squad-cmrc2018-zhtw | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 14839890
num_examples: 10142
- name: validation
num_bytes: 4976411
num_examples: 3219
- name: test
num_bytes: 1534360
num_examples: 1002
download_size: 4781898
dataset_size: 21350661
license: cc-by-sa-4.0
task_categories:
- question-answering
language:
- zh
size_categories:
- 10K<n<100K
---
# Dataset Card for "squad-cmrc2018-zhtw"
## Dataset Summary
[CMRC 2018](https://hfl-rc.github.io/cmrc2018/) is the dataset used in the shared task of the second "iFLYTEK Cup" Chinese Machine Reading Comprehension Evaluation Workshop (CMRC 2018).
It is a span-extraction dataset for Chinese machine reading comprehension, created to add linguistic diversity to the field. The dataset consists of nearly 20,000 real questions annotated by human experts on Wikipedia paragraphs.
It also includes an annotated challenge set of questions that require comprehensive understanding of the full context and multi-sentence inference.
Original data sources:
- https://hfl-rc.github.io/cmrc2018/
- https://github.com/ymcui/cmrc2018
## Data Download and Cleaning
1. Download the [cmrc2018](https://huggingface.co/datasets/cmrc2018) dataset
2. Convert Simplified Chinese to Traditional Chinese with [OpenCC](https://github.com/yichen0831/opencc-python)
3. Use Python regular expressions to clean stray characters left in `context`, `question`, and `answer`
4. Recompute the character offsets in `answers.answer_start` from `answers.text`
5. Upload to the Huggingface Hub with Huggingface Datasets
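Step 4 can be sketched as follows. `realign_answer_starts` is a hypothetical helper name, not the actual cleaning script; it assumes each cleaned answer still occurs verbatim in the cleaned context and takes the first occurrence:

```python
def realign_answer_starts(example):
    """Recompute `answers.answer_start` after text cleaning, since
    character offsets shift when characters are removed or converted."""
    starts = []
    for text in example["answers"]["text"]:
        # str.find returns the first character offset, or -1 if the
        # cleaned answer no longer appears verbatim in the context.
        starts.append(example["context"].find(text))
    example["answers"]["answer_start"] = starts
    return example

record = {
    "context": "巴士底廣場是法國首都巴黎的一個廣場是法國大革命的重要紀念地方。",
    "answers": {"text": ["法國大革命"], "answer_start": [0]},
}
realign_answer_starts(record)
print(record["answers"]["answer_start"])  # [18]
```

A production version would also have to disambiguate answers that occur more than once in the context, for example by keeping the occurrence closest to the original offset.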
## Dataset Structure
An example looks as follows:
```
{
"id":"DEV_1889_QUERY_0",
"context":"巴士底廣場是法國首都巴黎的一個廣場是法國大革命的重要紀念地方。過去是巴士底獄所在地直到攻佔巴士底獄隨後在法國革命期間的1789年7月14日到1790年7月14日之間被徹底破壞沒有留下任何痕跡。這個廣場跨巴黎市的3個區:第四區、第十一區和第十二區。這個廣場和周邊地區簡稱為“巴士底”。立於廣場中心的七月圓柱由路易-菲利普一世興建於1833年到1840年是為了紀念1830年的七月革命。其他顯著的特徵包括巴士底歌劇院、巴士底地鐵站以及一段聖馬丁運河。在1984年以前歌劇院所在的地方曾經是巴士底火車站。這個廣場經常舉辦音樂會或類似活動。巴士底的東北部擁有許多咖啡館、酒吧、夜總會和音樂廳夜生活頗為熱鬧。由於這個廣場具有相當的歷史意義也經常用於政治示威包括大規模的2006年3月28日法國勞工抗議。在巴士底廣場交匯的道路有聖安託萬路、聖安託萬市郊路、亨利四世大道、里昂路、勒努瓦大道、博馬舍大道等。",
"question":"巴士底廣場是哪場革命的重要紀念地方?",
"answers":{
"text":[
"法國大革命"
],
"answer_start":[
18
]
}
}
```
## Data Fields
The data fields are the same among all splits:
- `id`: (string) example ID
- `context`: (string) the passage providing context for the question
- `question`: (string) the question
- `answers`: the answer(s) extracted from the context; following the SQuAD structure, `text` and `answer_start` are lists
  - `text`: list(string) the answer text
  - `answer_start`: list(int) the character offset of each answer within `context`
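The invariant between `text` and `answer_start` can be checked by slicing: the stored offset is a character (not byte) index, so `context[start:start + len(text)]` should reproduce the answer. A minimal sketch using the example record shown above:

```python
example = {
    "context": "巴士底廣場是法國首都巴黎的一個廣場是法國大革命的重要紀念地方。",
    "answers": {"text": ["法國大革命"], "answer_start": [18]},
}

for text, start in zip(example["answers"]["text"],
                       example["answers"]["answer_start"]):
    # Python 3 strings index by code point, matching the dataset's offsets.
    span = example["context"][start:start + len(text)]
    assert span == text
```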
## Data Splits
The dataset contains the following splits:
- `train`: 10,142 examples
- `test`: 1,002 examples
- `validation`: 3,219 examples
## How to Use
```python
from datasets import load_dataset

# Use the `split="train"` argument to select the desired split
dataset = load_dataset("erhwenkuo/squad-cmrc2018-zhtw", split="train")
```
For a detailed tutorial, see:
- [NLP Course - Question Answering](https://huggingface.co/learn/nlp-course/zh-TW/chapter7/7?fw=pt)
## Licensing Information
CC BY-SA 4.0
## Citation
```
@inproceedings{cui-emnlp2019-cmrc2018,
title = "A Span-Extraction Dataset for {C}hinese Machine Reading Comprehension",
author = "Cui, Yiming and
Liu, Ting and
Che, Wanxiang and
Xiao, Li and
Chen, Zhipeng and
Ma, Wentao and
Wang, Shijin and
Hu, Guoping",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)",
month = nov,
year = "2019",
address = "Hong Kong, China",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D19-1600",
doi = "10.18653/v1/D19-1600",
pages = "5886--5891",
}
```
|
manishiitg/CogStack-Conv | ---
dataset_info:
features:
- name: org_text
dtype: string
- name: raw_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 10648396
num_examples: 2354
download_size: 4791241
dataset_size: 10648396
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jay401521/test | ---
dataset_info:
features:
- name: id
dtype: int64
- name: domain
dtype: string
- name: label
dtype: int64
- name: rank
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 2768369
num_examples: 30021
download_size: 1371145
dataset_size: 2768369
---
# Dataset Card for "test"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/docvqa_test_Salesforce_blip2-flan-t5-xxl_ns_100 | ---
dataset_info:
features:
- name: question
dtype: string
- name: id
dtype: int64
- name: answers
sequence: string
- name: generated_answer
dtype: string
splits:
- name: train
num_bytes: 8755
num_examples: 100
download_size: 7850
dataset_size: 8755
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iNeil77/commit-chronicle | ---
dataset_info:
- config_name: C
features:
- name: author
dtype: int64
- name: date
dtype: string
- name: timezone
dtype: int64
- name: hash
dtype: string
- name: message
dtype: string
- name: mods
list:
- name: change_type
dtype: string
- name: old_path
dtype: string
- name: new_path
dtype: string
- name: diff
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: repo
dtype: string
- name: original_message
dtype: string
splits:
- name: train
num_bytes: 1214269026.0285635
num_examples: 309153
- name: validation
num_bytes: 220284785.83363256
num_examples: 57970
- name: test
num_bytes: 148589006.99135485
num_examples: 38340
download_size: 516619057
dataset_size: 1583142818.853551
- config_name: C++
features:
- name: author
dtype: int64
- name: date
dtype: string
- name: timezone
dtype: int64
- name: hash
dtype: string
- name: message
dtype: string
- name: mods
list:
- name: change_type
dtype: string
- name: old_path
dtype: string
- name: new_path
dtype: string
- name: diff
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: repo
dtype: string
- name: original_message
dtype: string
splits:
- name: train
num_bytes: 3262697231.9482107
num_examples: 830683
- name: validation
num_bytes: 766516575.1115581
num_examples: 201716
- name: test
num_bytes: 479503779.0820391
num_examples: 123725
download_size: 1779547046
dataset_size: 4508717586.141808
- config_name: Go
features:
- name: author
dtype: int64
- name: date
dtype: string
- name: timezone
dtype: int64
- name: hash
dtype: string
- name: message
dtype: string
- name: mods
list:
- name: change_type
dtype: string
- name: old_path
dtype: string
- name: new_path
dtype: string
- name: diff
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: repo
dtype: string
- name: original_message
dtype: string
splits:
- name: train
num_bytes: 2639610249.9324474
num_examples: 672045
- name: validation
num_bytes: 509022394.3687841
num_examples: 133954
- name: test
num_bytes: 522034184.995527
num_examples: 134699
download_size: 1392783035
dataset_size: 3670666829.2967587
- config_name: Objective-C
features:
- name: author
dtype: int64
- name: date
dtype: string
- name: timezone
dtype: int64
- name: hash
dtype: string
- name: message
dtype: string
- name: mods
list:
- name: change_type
dtype: string
- name: old_path
dtype: string
- name: new_path
dtype: string
- name: diff
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: repo
dtype: string
- name: original_message
dtype: string
splits:
- name: train
num_bytes: 127717945.2224976
num_examples: 32517
- name: validation
num_bytes: 4917172.897511136
num_examples: 1294
- name: test
num_bytes: 29872823.836446613
num_examples: 7708
download_size: 52374411
dataset_size: 162507941.95645535
- config_name: Python
features:
- name: author
dtype: int64
- name: date
dtype: string
- name: timezone
dtype: int64
- name: hash
dtype: string
- name: message
dtype: string
- name: mods
list:
- name: change_type
dtype: string
- name: old_path
dtype: string
- name: new_path
dtype: string
- name: diff
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: repo
dtype: string
- name: original_message
dtype: string
splits:
- name: train
num_bytes: 5224487604.251047
num_examples: 1330155
- name: validation
num_bytes: 807734947.9240026
num_examples: 212563
- name: test
num_bytes: 958895166.8964008
num_examples: 247421
download_size: 2161676583
dataset_size: 6991117719.07145
- config_name: Ruby
features:
- name: author
dtype: int64
- name: date
dtype: string
- name: timezone
dtype: int64
- name: hash
dtype: string
- name: message
dtype: string
- name: mods
list:
- name: change_type
dtype: string
- name: old_path
dtype: string
- name: new_path
dtype: string
- name: diff
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: repo
dtype: string
- name: original_message
dtype: string
splits:
- name: train
num_bytes: 714516644.312079
num_examples: 181916
- name: validation
num_bytes: 151664764.05368194
num_examples: 39912
- name: test
num_bytes: 129571629.38815771
num_examples: 33433
download_size: 243994774
dataset_size: 995753037.7539186
- config_name: Rust
features:
- name: author
dtype: int64
- name: date
dtype: string
- name: timezone
dtype: int64
- name: hash
dtype: string
- name: message
dtype: string
- name: mods
list:
- name: change_type
dtype: string
- name: old_path
dtype: string
- name: new_path
dtype: string
- name: diff
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: repo
dtype: string
- name: original_message
dtype: string
splits:
- name: train
num_bytes: 942800148.1493574
num_examples: 240037
- name: validation
num_bytes: 230993126.81136546
num_examples: 60788
- name: test
num_bytes: 175047461.6269829
num_examples: 45167
download_size: 541549356
dataset_size: 1348840736.5877059
- config_name: Swift
features:
- name: author
dtype: int64
- name: date
dtype: string
- name: timezone
dtype: int64
- name: hash
dtype: string
- name: message
dtype: string
- name: mods
list:
- name: change_type
dtype: string
- name: old_path
dtype: string
- name: new_path
dtype: string
- name: diff
dtype: string
- name: language
dtype: string
- name: license
dtype: string
- name: repo
dtype: string
- name: original_message
dtype: string
splits:
- name: train
num_bytes: 397776768.5968331
num_examples: 101274
- name: validation
num_bytes: 107262008.79292645
num_examples: 28227
- name: test
num_bytes: 34639763.81034767
num_examples: 8938
download_size: 181314627
dataset_size: 539678541.2001072
configs:
- config_name: C
data_files:
- split: train
path: C/train-*
- split: validation
path: C/validation-*
- split: test
path: C/test-*
- config_name: C++
data_files:
- split: train
path: C++/train-*
- split: validation
path: C++/validation-*
- split: test
path: C++/test-*
- config_name: Go
data_files:
- split: train
path: Go/train-*
- split: validation
path: Go/validation-*
- split: test
path: Go/test-*
- config_name: Objective-C
data_files:
- split: train
path: Objective-C/train-*
- split: validation
path: Objective-C/validation-*
- split: test
path: Objective-C/test-*
- config_name: Python
data_files:
- split: train
path: Python/train-*
- split: validation
path: Python/validation-*
- split: test
path: Python/test-*
- config_name: Ruby
data_files:
- split: train
path: Ruby/train-*
- split: validation
path: Ruby/validation-*
- split: test
path: Ruby/test-*
- config_name: Rust
data_files:
- split: train
path: Rust/train-*
- split: validation
path: Rust/validation-*
- split: test
path: Rust/test-*
- config_name: Swift
data_files:
- split: train
path: Swift/train-*
- split: validation
path: Swift/validation-*
- split: test
path: Swift/test-*
---
|
Mithilss/download_quick | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct | ---
pretty_name: Evaluation run of frankenmerger/gemoy-4b-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [frankenmerger/gemoy-4b-instruct](https://huggingface.co/frankenmerger/gemoy-4b-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T10:59:13.672299](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct/blob/main/results_2024-03-10T10-59-13.672299.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3635342339637508,\n\
\ \"acc_stderr\": 0.03346560799526674,\n \"acc_norm\": 0.36857377594697643,\n\
\ \"acc_norm_stderr\": 0.03436928129673128,\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.015594753632006518,\n \"mc2\": 0.46641168216975853,\n\
\ \"mc2_stderr\": 0.016269583261373614\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3728668941979522,\n \"acc_stderr\": 0.014131176760131167,\n\
\ \"acc_norm\": 0.4069965870307167,\n \"acc_norm_stderr\": 0.01435639941800913\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44981079466241786,\n\
\ \"acc_stderr\": 0.004964579685712441,\n \"acc_norm\": 0.5802628958374826,\n\
\ \"acc_norm_stderr\": 0.004925072159723828\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n\
\ \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.362962962962963,\n\
\ \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.05021167315686781,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.05021167315686781\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4075471698113208,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.4075471698113208,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2832369942196532,\n\
\ \"acc_stderr\": 0.034355680560478746,\n \"acc_norm\": 0.2832369942196532,\n\
\ \"acc_norm_stderr\": 0.034355680560478746\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.34893617021276596,\n \"acc_stderr\": 0.031158522131357783,\n\
\ \"acc_norm\": 0.34893617021276596,\n \"acc_norm_stderr\": 0.031158522131357783\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.35161290322580646,\n\
\ \"acc_stderr\": 0.027162537826948458,\n \"acc_norm\": 0.35161290322580646,\n\
\ \"acc_norm_stderr\": 0.027162537826948458\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22660098522167488,\n \"acc_stderr\": 0.029454863835292992,\n\
\ \"acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.029454863835292992\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.40606060606060607,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.40606060606060607,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4797979797979798,\n \"acc_stderr\": 0.035594435655639196,\n \"\
acc_norm\": 0.4797979797979798,\n \"acc_norm_stderr\": 0.035594435655639196\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47150259067357514,\n \"acc_stderr\": 0.036025735712884414,\n\
\ \"acc_norm\": 0.47150259067357514,\n \"acc_norm_stderr\": 0.036025735712884414\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3487179487179487,\n \"acc_stderr\": 0.024162780284017717,\n\
\ \"acc_norm\": 0.3487179487179487,\n \"acc_norm_stderr\": 0.024162780284017717\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21481481481481482,\n \"acc_stderr\": 0.025040443877000683,\n \
\ \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.025040443877000683\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.030283995525884396,\n\
\ \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.45504587155963305,\n \"acc_stderr\": 0.021350503090925167,\n \"\
acc_norm\": 0.45504587155963305,\n \"acc_norm_stderr\": 0.021350503090925167\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.22685185185185186,\n \"acc_stderr\": 0.02856165010242226,\n \"\
acc_norm\": 0.22685185185185186,\n \"acc_norm_stderr\": 0.02856165010242226\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4117647058823529,\n \"acc_stderr\": 0.034542365853806094,\n \"\
acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.034542365853806094\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.45569620253164556,\n \"acc_stderr\": 0.03241920684693334,\n \
\ \"acc_norm\": 0.45569620253164556,\n \"acc_norm_stderr\": 0.03241920684693334\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n\
\ \"acc_stderr\": 0.03219079200419997,\n \"acc_norm\": 0.35874439461883406,\n\
\ \"acc_norm_stderr\": 0.03219079200419997\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3312883435582822,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.3312883435582822,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285712,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285712\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4368932038834951,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.4368932038834951,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5769230769230769,\n\
\ \"acc_stderr\": 0.032366121762202014,\n \"acc_norm\": 0.5769230769230769,\n\
\ \"acc_norm_stderr\": 0.032366121762202014\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.421455938697318,\n\
\ \"acc_stderr\": 0.017657976412654857,\n \"acc_norm\": 0.421455938697318,\n\
\ \"acc_norm_stderr\": 0.017657976412654857\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.44508670520231214,\n \"acc_stderr\": 0.02675625512966377,\n\
\ \"acc_norm\": 0.44508670520231214,\n \"acc_norm_stderr\": 0.02675625512966377\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.01446589382985993,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.01446589382985993\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.46405228758169936,\n \"acc_stderr\": 0.02855582751652879,\n\
\ \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.02855582751652879\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3665594855305466,\n\
\ \"acc_stderr\": 0.02736807824397163,\n \"acc_norm\": 0.3665594855305466,\n\
\ \"acc_norm_stderr\": 0.02736807824397163\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.02743162372241502,\n\
\ \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.02743162372241502\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2907801418439716,\n \"acc_stderr\": 0.027090664368353178,\n \
\ \"acc_norm\": 0.2907801418439716,\n \"acc_norm_stderr\": 0.027090664368353178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3259452411994785,\n\
\ \"acc_stderr\": 0.01197150729498278,\n \"acc_norm\": 0.3259452411994785,\n\
\ \"acc_norm_stderr\": 0.01197150729498278\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25735294117647056,\n \"acc_stderr\": 0.026556519470041513,\n\
\ \"acc_norm\": 0.25735294117647056,\n \"acc_norm_stderr\": 0.026556519470041513\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3545751633986928,\n \"acc_stderr\": 0.019353360547553714,\n \
\ \"acc_norm\": 0.3545751633986928,\n \"acc_norm_stderr\": 0.019353360547553714\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.38181818181818183,\n\
\ \"acc_stderr\": 0.04653429807913509,\n \"acc_norm\": 0.38181818181818183,\n\
\ \"acc_norm_stderr\": 0.04653429807913509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5020408163265306,\n \"acc_stderr\": 0.0320089533497105,\n\
\ \"acc_norm\": 0.5020408163265306,\n \"acc_norm_stderr\": 0.0320089533497105\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3781094527363184,\n\
\ \"acc_stderr\": 0.03428867848778657,\n \"acc_norm\": 0.3781094527363184,\n\
\ \"acc_norm_stderr\": 0.03428867848778657\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4152046783625731,\n \"acc_stderr\": 0.03779275945503201,\n\
\ \"acc_norm\": 0.4152046783625731,\n \"acc_norm_stderr\": 0.03779275945503201\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.015594753632006518,\n \"mc2\": 0.46641168216975853,\n\
\ \"mc2_stderr\": 0.016269583261373614\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5943172849250198,\n \"acc_stderr\": 0.013800206336014203\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/frankenmerger/gemoy-4b-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|arc:challenge|25_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|gsm8k|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hellaswag|10_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|winogrande|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T10-59-13.672299.parquet'
- config_name: results
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- results_2024-03-10T10-59-13.672299.parquet
- split: latest
path:
- results_2024-03-10T10-59-13.672299.parquet
---
# Dataset Card for Evaluation run of frankenmerger/gemoy-4b-instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [frankenmerger/gemoy-4b-instruct](https://huggingface.co/frankenmerger/gemoy-4b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following (note that the configurations define timestamped splits and a "latest" split, not a "train" split):
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-10T10:59:13.672299](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct/blob/main/results_2024-03-10T10-59-13.672299.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its timestamped splits and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.3635342339637508,
"acc_stderr": 0.03346560799526674,
"acc_norm": 0.36857377594697643,
"acc_norm_stderr": 0.03436928129673128,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006518,
"mc2": 0.46641168216975853,
"mc2_stderr": 0.016269583261373614
},
"harness|arc:challenge|25": {
"acc": 0.3728668941979522,
"acc_stderr": 0.014131176760131167,
"acc_norm": 0.4069965870307167,
"acc_norm_stderr": 0.01435639941800913
},
"harness|hellaswag|10": {
"acc": 0.44981079466241786,
"acc_stderr": 0.004964579685712441,
"acc_norm": 0.5802628958374826,
"acc_norm_stderr": 0.004925072159723828
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.041539484047424,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.041539484047424
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686781,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686781
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4075471698113208,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.4075471698113208,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.034355680560478746,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.034355680560478746
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.34893617021276596,
"acc_stderr": 0.031158522131357783,
"acc_norm": 0.34893617021276596,
"acc_norm_stderr": 0.031158522131357783
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194974,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194974
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.35161290322580646,
"acc_stderr": 0.027162537826948458,
"acc_norm": 0.35161290322580646,
"acc_norm_stderr": 0.027162537826948458
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.029454863835292992,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.029454863835292992
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.40606060606060607,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.40606060606060607,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4797979797979798,
"acc_stderr": 0.035594435655639196,
"acc_norm": 0.4797979797979798,
"acc_norm_stderr": 0.035594435655639196
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47150259067357514,
"acc_stderr": 0.036025735712884414,
"acc_norm": 0.47150259067357514,
"acc_norm_stderr": 0.036025735712884414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3487179487179487,
"acc_stderr": 0.024162780284017717,
"acc_norm": 0.3487179487179487,
"acc_norm_stderr": 0.024162780284017717
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.025040443877000683,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.025040443877000683
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45504587155963305,
"acc_stderr": 0.021350503090925167,
"acc_norm": 0.45504587155963305,
"acc_norm_stderr": 0.021350503090925167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.22685185185185186,
"acc_stderr": 0.02856165010242226,
"acc_norm": 0.22685185185185186,
"acc_norm_stderr": 0.02856165010242226
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.034542365853806094,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.034542365853806094
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.45569620253164556,
"acc_stderr": 0.03241920684693334,
"acc_norm": 0.45569620253164556,
"acc_norm_stderr": 0.03241920684693334
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.03219079200419997,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.03219079200419997
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3312883435582822,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.3312883435582822,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285712,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285712
},
"harness|hendrycksTest-management|5": {
"acc": 0.4368932038834951,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.4368932038834951,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.032366121762202014,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.032366121762202014
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.421455938697318,
"acc_stderr": 0.017657976412654857,
"acc_norm": 0.421455938697318,
"acc_norm_stderr": 0.017657976412654857
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.02675625512966377,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.02675625512966377
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.01446589382985993,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.01446589382985993
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.02855582751652879,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.02855582751652879
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3665594855305466,
"acc_stderr": 0.02736807824397163,
"acc_norm": 0.3665594855305466,
"acc_norm_stderr": 0.02736807824397163
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.02743162372241502,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.02743162372241502
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2907801418439716,
"acc_stderr": 0.027090664368353178,
"acc_norm": 0.2907801418439716,
"acc_norm_stderr": 0.027090664368353178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3259452411994785,
"acc_stderr": 0.01197150729498278,
"acc_norm": 0.3259452411994785,
"acc_norm_stderr": 0.01197150729498278
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25735294117647056,
"acc_stderr": 0.026556519470041513,
"acc_norm": 0.25735294117647056,
"acc_norm_stderr": 0.026556519470041513
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3545751633986928,
"acc_stderr": 0.019353360547553714,
"acc_norm": 0.3545751633986928,
"acc_norm_stderr": 0.019353360547553714
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.04653429807913509,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.04653429807913509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5020408163265306,
"acc_stderr": 0.0320089533497105,
"acc_norm": 0.5020408163265306,
"acc_norm_stderr": 0.0320089533497105
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3781094527363184,
"acc_stderr": 0.03428867848778657,
"acc_norm": 0.3781094527363184,
"acc_norm_stderr": 0.03428867848778657
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4152046783625731,
"acc_stderr": 0.03779275945503201,
"acc_norm": 0.4152046783625731,
"acc_norm_stderr": 0.03779275945503201
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006518,
"mc2": 0.46641168216975853,
"mc2_stderr": 0.016269583261373614
},
"harness|winogrande|5": {
"acc": 0.5943172849250198,
"acc_stderr": 0.013800206336014203
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Back-up/chung-khoan-demo-12-final | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: res
dtype: string
splits:
- name: train
num_bytes: 33598506
num_examples: 6781
download_size: 12015438
dataset_size: 33598506
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
billsum | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- cc0-1.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
task_ids: []
paperswithcode_id: billsum
pretty_name: BillSum
tags:
- bills-summarization
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 219596090
num_examples: 18949
- name: test
num_bytes: 37866257
num_examples: 3269
- name: ca_test
num_bytes: 14945291
num_examples: 1237
download_size: 113729382
dataset_size: 272407638
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: ca_test
path: data/ca_test-*
train-eval-index:
- config: default
task: summarization
task_id: summarization
splits:
train_split: train
eval_split: test
col_mapping:
text: text
summary: target
metrics:
- type: rouge
name: Rouge
---
# Dataset Card for "billsum"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/FiscalNote/BillSum](https://github.com/FiscalNote/BillSum)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** https://aclanthology.org/D19-5406/
- **Paper:** https://arxiv.org/abs/1910.00523
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 67.26 MB
- **Size of the generated dataset:** 272.42 MB
- **Total amount of disk used:** 339.68 MB
### Dataset Summary
BillSum is a dataset for the summarization of US Congressional and California state bills.
There are several features:
- text: bill text.
- summary: summary of the bill.
- title: title of the bill.
The following features exist for US bills only; CA bills do not have them:
- text_len: number of characters in the text.
- sum_len: number of characters in the summary.
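Since `text_len` and `sum_len` are plain character counts, they can be recomputed directly; a minimal sketch (the record below is illustrative, in the shape shown under "Data Instances", not real data):

```python
# Illustrative record in the shape shown under "Data Instances" (not real data)
example = {
    "text": "some text.",
    "summary": "some summary",
    "title": "An act to amend Section xxx.",
}

# text_len / sum_len are plain character counts over the bill text and its summary
text_len = len(example["text"])
sum_len = len(example["summary"])
print(text_len, sum_len)  # 10 12
```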
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 67.26 MB
- **Size of the generated dataset:** 272.42 MB
- **Total amount of disk used:** 339.68 MB
An example of 'train' looks as follows.
```
{
"summary": "some summary",
"text": "some text.",
"title": "An act to amend Section xxx."
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `text`: a `string` feature.
- `summary`: a `string` feature.
- `title`: a `string` feature.
### Data Splits
| name |train|ca_test|test|
|-------|----:|------:|---:|
|default|18949| 1237|3269|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
The data consists of three parts: US training bills, US test bills, and California test bills. The US bills were collected from the [Govinfo](https://github.com/unitedstates/congress) service provided by the United States Government Publishing Office (GPO) under a CC0-1.0 license. The California bills, from the 2015-2016 session, are available from the legislature’s [website](https://leginfo.legislature.ca.gov/).
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{kornilova-eidelman-2019-billsum,
title = "{B}ill{S}um: A Corpus for Automatic Summarization of {US} Legislation",
author = "Kornilova, Anastassia and
Eidelman, Vladimir",
editor = "Wang, Lu and
Cheung, Jackie Chi Kit and
Carenini, Giuseppe and
Liu, Fei",
booktitle = "Proceedings of the 2nd Workshop on New Frontiers in Summarization",
month = nov,
year = "2019",
address = "Hong Kong, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/D19-5406",
doi = "10.18653/v1/D19-5406",
pages = "48--56",
eprint={1910.00523},
archivePrefix={arXiv},
primaryClass={cs.CL},
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@jplu](https://github.com/jplu), [@lewtun](https://github.com/lewtun) for adding this dataset. |
james-burton/OrientalMuseum_min6-mat-text | ---
dataset_info:
features:
- name: obj_num
dtype: string
- name: file
dtype: string
- name: image
dtype: image
- name: root
dtype: string
- name: description
dtype: string
- name: object_name
dtype: string
- name: other_name
dtype: string
- name: label
dtype:
class_label:
names:
'0': Animal Mummy
'1': Colour on Paper
'2': Flint/Chert
'3': Gouache on Paper
'4': Ink and Colour on Paper
'5': Ink and Colours on Silk
'6': Ink and Opaque Watercolour on Paper
'7': Ink on Paper
'8': Japanese paper
'9': Opaque Watercolour on Paper
'10': Opaque Watercolour or Gouache on Mica
'11': Pith
'12': Pith Paper
'13': Resin/Plastic
'14': Rhinoceros Horn
'15': Steatite/Soap Stone
'16': Watercolour on Rice Paper
'17': agate
'18': alabaster
'19': aluminum
'20': amber
'21': bamboo
'22': basalt
'23': bone
'24': brass
'25': bronze
'26': canvas
'27': cardboard
'28': cards
'29': carnelian
'30': ceramic
'31': clay
'32': copper
'33': copper alloy
'34': cotton
'35': earthenware
'36': faience
'37': flax
'38': flint
'39': glass
'40': gold
'41': granite
'42': gray ware
'43': hardwood
'44': horn
'45': ink
'46': iron
'47': ivory
'48': jade
'49': jasper
'50': lacquer
'51': lapis lazuli
'52': lead
'53': lead alloy
'54': leather
'55': limestone
'56': linen
'57': metal
'58': mother of pearl
'59': nephrite
'60': nylon
'61': paint
'62': paper
'63': papyrus
'64': photographic paper
'65': plaster
'66': plastic
'67': plate
'68': polyester
'69': porcelain
'70': pottery
'71': rattan
'72': rice paper
'73': sandstone
'74': satin
'75': schist
'76': serpentine
'77': shell
'78': silk
'79': silver
'80': soapstone
'81': steel
'82': stone
'83': stoneware
'84': stucco
'85': sycamore
'86': terracotta
'87': textiles
'88': travertine
'89': velvet
'90': wood
'91': wool
- name: production.period
dtype: string
- name: production.place
dtype: string
splits:
- name: train
num_bytes: 845879423.1426406
num_examples: 7362
- name: validation
num_bytes: 207904013.96767974
num_examples: 1733
- name: test
num_bytes: 193768714.5506797
num_examples: 1733
download_size: 1253546751
dataset_size: 1247552151.661
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
heegyu/glaive-function-calling-v2-ko | ---
license: apache-2.0
---
- Original Dataset: [glaiveai/glaive-function-calling-v2](https://huggingface.co/datasets/glaiveai/glaive-function-calling-v2)
- Translated with ChatGPT; only 15,000 examples of the full dataset were translated
- Prompt:
```
You are a Korean translator. Data in the format of a given json array contains conversations between user and assistant. Each element in the array has roles and contents. You must translate the content value of the element when the role is user or assistant. You must also meet the following conditions.
1. The result must be preserved in json format.
2. The tone of the translated text should be a natural everyday conversation tone.
3. The translation content should not include the content that you are translating.
```
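The prompt above only requires translating the `content` of `user`/`assistant` turns while keeping the JSON structure intact; a minimal sketch of that selection rule, using a hypothetical conversation:

```python
# Hypothetical conversation in the {role, content} array format the prompt describes
chat = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi there!"},
]

def translatable_contents(turns):
    # Only user/assistant contents are translated; other roles stay untouched
    return [t["content"] for t in turns if t["role"] in ("user", "assistant")]

print(translatable_contents(chat))  # ['Hello!', 'Hi there!']
```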
- The data was then passed to the model whole, in JSON format |
pccl-org/formal-logic-simple-order-new-objects-paired-bigger-2000 | ---
dataset_info:
features:
- name: greater_than
dtype: string
- name: less_than
dtype: string
- name: paired_example
sequence:
sequence: string
- name: correct_example
sequence: string
- name: incorrect_example
sequence: string
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 505630324
num_examples: 1997003
download_size: 162719254
dataset_size: 505630324
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "formal-logic-simple-order-new-objects-paired-bigger-2000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KennethTM/squad_pairs_danish | ---
dataset_info:
features:
- name: query
dtype: string
- name: passage
dtype: string
splits:
- name: train
num_bytes: 69338889
num_examples: 87599
download_size: 11644151
dataset_size: 69338889
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- da
task_categories:
- feature-extraction
- question-answering
license: cc-by-sa-4.0
---
# SQuAD question-answer pairs in Danish
## About
This dataset is a version of the [SQuAD question-answer pairs dataset](https://huggingface.co/datasets/sentence-transformers/embedding-training-data) machine-translated from English to Danish ([link to original dataset](https://huggingface.co/datasets/squad)).
Machine translation is performed using the Helsinki NLP [English-to-Danish OPUS-MT model](https://huggingface.co/Helsinki-NLP/opus-mt-en-da).
The dataset contains ~87k question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage').
## Usage
Using the HuggingFace datasets library:
```python
from datasets import load_dataset

# ~87k pairs in a single "train" split, each with "query" and "passage" fields
dataset = load_dataset("KennethTM/squad_pairs_danish")
print(dataset["train"][0])
```
vinisebk/tina | ---
license: openrail
---
|
anyspeech/ucla_test | ---
dataset_info:
features:
- name: filename
dtype: string
- name: phones
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: sampling_rate
dtype: int64
splits:
- name: train
num_bytes: 726465945
num_examples: 5444
download_size: 558156867
dataset_size: 726465945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ucla_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-public_relations-original-neg | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 4442.872727272727
num_examples: 17
download_size: 7745
dataset_size: 4442.872727272727
---
# Dataset Card for "mmlu-public_relations-original-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
usvsnsp/deduped-embeddings | ---
dataset_info:
features:
- name: sequence_id
dtype: int64
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 11138657220
num_examples: 7195515
download_size: 15591208109
dataset_size: 11138657220
---
# Dataset Card for "deduped-embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bdsaglam/webnlg-jerx-sft-multi-turn-openai | ---
dataset_info:
features:
- name: chat
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 13562315
num_examples: 17636
- name: dev
num_bytes: 1718829
num_examples: 2249
- name: test
num_bytes: 3051253
num_examples: 3668
download_size: 5347519
dataset_size: 18332397
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
|
Sleoruiz/discursos-quinta-class-separated-by-idx | ---
dataset_info:
features:
- name: text
dtype: string
- name: name
dtype: string
- name: comision
dtype: string
- name: gaceta_numero
dtype: string
- name: fecha_gaceta
dtype: string
- name: labels
sequence: string
- name: scores
sequence: float64
- name: idx
dtype: int64
splits:
- name: train
num_bytes: 21844473
num_examples: 13985
download_size: 10501093
dataset_size: 21844473
---
# Dataset Card for "discursos-quinta-class-separated-by-idx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_abideen__AlphaMonarch-dora | ---
pretty_name: Evaluation run of abideen/AlphaMonarch-dora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abideen/AlphaMonarch-dora](https://huggingface.co/abideen/AlphaMonarch-dora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abideen__AlphaMonarch-dora\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-03T07:07:17.386749](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__AlphaMonarch-dora/blob/main/results_2024-03-03T07-07-17.386749.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6504117306116304,\n\
\ \"acc_stderr\": 0.032199881590836504,\n \"acc_norm\": 0.6503766868895836,\n\
\ \"acc_norm_stderr\": 0.032867032729173594,\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7802224048372912,\n\
\ \"mc2_stderr\": 0.013732560971719165\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7039249146757679,\n \"acc_stderr\": 0.013340916085246258,\n\
\ \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136445\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.72176857199761,\n \
\ \"acc_stderr\": 0.004472121485161928,\n \"acc_norm\": 0.8925512846046604,\n\
\ \"acc_norm_stderr\": 0.003090499801090434\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.040261414976346104,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.040261414976346104\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\
\ \"acc_stderr\": 0.01649540063582008,\n \"acc_norm\": 0.41787709497206704,\n\
\ \"acc_norm_stderr\": 0.01649540063582008\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4830508474576271,\n\
\ \"acc_stderr\": 0.012762896889210867,\n \"acc_norm\": 0.4830508474576271,\n\
\ \"acc_norm_stderr\": 0.012762896889210867\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7802224048372912,\n\
\ \"mc2_stderr\": 0.013732560971719165\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.01018430821477578\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6573161485974223,\n \
\ \"acc_stderr\": 0.013073030230827912\n }\n}\n```"
repo_url: https://huggingface.co/abideen/AlphaMonarch-dora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|arc:challenge|25_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|gsm8k|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hellaswag|10_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T07-07-17.386749.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T07-07-17.386749.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- '**/details_harness|winogrande|5_2024-03-03T07-07-17.386749.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-03T07-07-17.386749.parquet'
- config_name: results
data_files:
- split: 2024_03_03T07_07_17.386749
path:
- results_2024-03-03T07-07-17.386749.parquet
- split: latest
path:
- results_2024-03-03T07-07-17.386749.parquet
---
# Dataset Card for Evaluation run of abideen/AlphaMonarch-dora
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abideen/AlphaMonarch-dora](https://huggingface.co/abideen/AlphaMonarch-dora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abideen__AlphaMonarch-dora",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-03T07:07:17.386749](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__AlphaMonarch-dora/blob/main/results_2024-03-03T07-07-17.386749.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6504117306116304,
"acc_stderr": 0.032199881590836504,
"acc_norm": 0.6503766868895836,
"acc_norm_stderr": 0.032867032729173594,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7802224048372912,
"mc2_stderr": 0.013732560971719165
},
"harness|arc:challenge|25": {
"acc": 0.7039249146757679,
"acc_stderr": 0.013340916085246258,
"acc_norm": 0.7320819112627986,
"acc_norm_stderr": 0.012942030195136445
},
"harness|hellaswag|10": {
"acc": 0.72176857199761,
"acc_stderr": 0.004472121485161928,
"acc_norm": 0.8925512846046604,
"acc_norm_stderr": 0.003090499801090434
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.040261414976346104,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.040261414976346104
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.01649540063582008,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.01649540063582008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4830508474576271,
"acc_stderr": 0.012762896889210867,
"acc_norm": 0.4830508474576271,
"acc_norm_stderr": 0.012762896889210867
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7802224048372912,
"mc2_stderr": 0.013732560971719165
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.01018430821477578
},
"harness|gsm8k|5": {
"acc": 0.6573161485974223,
"acc_stderr": 0.013073030230827912
}
}
```
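As an illustration of how such a results dictionary can be post-processed (this is a sketch, not the leaderboard's exact aggregation code), the top-level accuracy roughly corresponds to an unweighted mean of the per-task `"acc"` values:

```python
# Sketch: recompute an unweighted average accuracy from a results dict
# shaped like the JSON above (only a small subset of tasks shown here).
results = {
    "harness|hendrycksTest-management|5": {"acc": 0.7864077669902912},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8888888888888888},
    "harness|hendrycksTest-virology|5": {"acc": 0.5662650602409639},
}

accs = [scores["acc"] for scores in results.values()]
average_acc = sum(accs) / len(accs)
print(round(average_acc, 4))  # → 0.7472 (unweighted mean over these tasks)
```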
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
thu-coai/chid | ---
license: apache-2.0
language:
- zh
---
The ChID dataset. [GitHub repo](https://github.com/chujiezheng/ChID-Dataset). [Original paper](https://arxiv.org/abs/1906.01265).
```bib
@inproceedings{zheng-etal-2019-chid,
title = "{C}h{ID}: A Large-scale {C}hinese {ID}iom Dataset for Cloze Test",
author = "Zheng, Chujie and
Huang, Minlie and
Sun, Aixin",
booktitle = "ACL",
year = "2019"
}
``` |
PatrickHaller/hurt | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 777000
num_examples: 1000
- name: validation
num_bytes: 77700
num_examples: 100
download_size: 11908
dataset_size: 854700
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "hurt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30-v0.1 | ---
pretty_name: Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-30-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wang7776/Mistral-7B-Instruct-v0.2-sparsity-30-v0.1](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-30-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-17T20:56:09.604059](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30-v0.1/blob/main/results_2024-01-17T20-56-09.604059.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6024881396252101,\n\
\ \"acc_stderr\": 0.03335765627539204,\n \"acc_norm\": 0.6070050987236348,\n\
\ \"acc_norm_stderr\": 0.03403883355182919,\n \"mc1\": 0.5079559363525091,\n\
\ \"mc1_stderr\": 0.017501285074551825,\n \"mc2\": 0.6627552049915408,\n\
\ \"mc2_stderr\": 0.015444533101130177\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216384,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.0140841331181043\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6559450308703445,\n\
\ \"acc_stderr\": 0.004740882120999965,\n \"acc_norm\": 0.8436566421031667,\n\
\ \"acc_norm_stderr\": 0.0036243831208234508\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6709677419354839,\n \"acc_stderr\": 0.02672949906834996,\n \"\
acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.02672949906834996\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868578,\n\
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868578\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7853211009174312,\n \"acc_stderr\": 0.01760430414925648,\n \"\
acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.01760430414925648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7879948914431673,\n\
\ \"acc_stderr\": 0.01461609938583367,\n \"acc_norm\": 0.7879948914431673,\n\
\ \"acc_norm_stderr\": 0.01461609938583367\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.02507071371915319,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.02507071371915319\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3396648044692737,\n\
\ \"acc_stderr\": 0.015839400406212494,\n \"acc_norm\": 0.3396648044692737,\n\
\ \"acc_norm_stderr\": 0.015839400406212494\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n\
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409825,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409825\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4106910039113429,\n\
\ \"acc_stderr\": 0.01256487154253435,\n \"acc_norm\": 0.4106910039113429,\n\
\ \"acc_norm_stderr\": 0.01256487154253435\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6209150326797386,\n \"acc_stderr\": 0.019627444748412232,\n \
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.019627444748412232\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5079559363525091,\n\
\ \"mc1_stderr\": 0.017501285074551825,\n \"mc2\": 0.6627552049915408,\n\
\ \"mc2_stderr\": 0.015444533101130177\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39423805913570886,\n \
\ \"acc_stderr\": 0.013460852357095656\n }\n}\n```"
repo_url: https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-30-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|arc:challenge|25_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|gsm8k|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hellaswag|10_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T20-56-09.604059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T20-56-09.604059.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- '**/details_harness|winogrande|5_2024-01-17T20-56-09.604059.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-17T20-56-09.604059.parquet'
- config_name: results
data_files:
- split: 2024_01_17T20_56_09.604059
path:
- results_2024-01-17T20-56-09.604059.parquet
- split: latest
path:
- results_2024-01-17T20-56-09.604059.parquet
---
# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-30-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wang7776/Mistral-7B-Instruct-v0.2-sparsity-30-v0.1](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-30-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30-v0.1",
    "harness_winogrande_5",
    split="latest",  # splits are named after the run timestamp, plus "latest"
)
```
## Latest results
These are the [latest results from run 2024-01-17T20:56:09.604059](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30-v0.1/blob/main/results_2024-01-17T20-56-09.604059.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results file and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6024881396252101,
"acc_stderr": 0.03335765627539204,
"acc_norm": 0.6070050987236348,
"acc_norm_stderr": 0.03403883355182919,
"mc1": 0.5079559363525091,
"mc1_stderr": 0.017501285074551825,
"mc2": 0.6627552049915408,
"mc2_stderr": 0.015444533101130177
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216384,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.0140841331181043
},
"harness|hellaswag|10": {
"acc": 0.6559450308703445,
"acc_stderr": 0.004740882120999965,
"acc_norm": 0.8436566421031667,
"acc_norm_stderr": 0.0036243831208234508
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939391,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939391
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.025158266016868578,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.025158266016868578
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.01760430414925648,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.01760430414925648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909476,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909476
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7879948914431673,
"acc_stderr": 0.01461609938583367,
"acc_norm": 0.7879948914431673,
"acc_norm_stderr": 0.01461609938583367
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.02507071371915319,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.02507071371915319
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3396648044692737,
"acc_stderr": 0.015839400406212494,
"acc_norm": 0.3396648044692737,
"acc_norm_stderr": 0.015839400406212494
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.027057974624494382,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.027057974624494382
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666904,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4106910039113429,
"acc_stderr": 0.01256487154253435,
"acc_norm": 0.4106910039113429,
"acc_norm_stderr": 0.01256487154253435
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.019627444748412232,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.019627444748412232
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5079559363525091,
"mc1_stderr": 0.017501285074551825,
"mc2": 0.6627552049915408,
"mc2_stderr": 0.015444533101130177
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.39423805913570886,
"acc_stderr": 0.013460852357095656
}
}
```
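The per-task scores above can be aggregated without re-running the harness; a minimal sketch, using a small excerpt of the JSON above and the `harness|hendrycksTest-` key prefix shown there (only two MMLU tasks are included here for brevity):

```python
# Average the MMLU (hendrycksTest) accuracies from a results dict shaped
# like the JSON excerpt above; non-MMLU entries are filtered out by prefix.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5777777777777777},
    "harness|truthfulqa:mc|0": {"mc1": 0.5079559363525091},
}

mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # mean over the two MMLU tasks in this excerpt
```

The same pattern extends to all 57 `hendrycksTest` tasks when run over the full results dict.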
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
irds/msmarco-document-v2_trec-dl-2020_judged | ---
pretty_name: '`msmarco-document-v2/trec-dl-2020/judged`'
viewer: false
source_datasets: ['irds/msmarco-document-v2', 'irds/msmarco-document-v2_trec-dl-2020']
task_categories:
- text-retrieval
---
# Dataset Card for `msmarco-document-v2/trec-dl-2020/judged`
The `msmarco-document-v2/trec-dl-2020/judged` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/msmarco-document-v2#msmarco-document-v2/trec-dl-2020/judged).
# Data
This dataset provides:
- `queries` (i.e., topics); count=45
- For `docs`, use [`irds/msmarco-document-v2`](https://huggingface.co/datasets/irds/msmarco-document-v2)
- For `qrels`, use [`irds/msmarco-document-v2_trec-dl-2020`](https://huggingface.co/datasets/irds/msmarco-document-v2_trec-dl-2020)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/msmarco-document-v2_trec-dl-2020_judged', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Craswell2020TrecDl,
title={Overview of the TREC 2020 deep learning track},
author={Nick Craswell and Bhaskar Mitra and Emine Yilmaz and Daniel Campos},
booktitle={TREC},
year={2020}
}
@inproceedings{Bajaj2016Msmarco,
title={MS MARCO: A Human Generated MAchine Reading COmprehension Dataset},
author={Payal Bajaj, Daniel Campos, Nick Craswell, Li Deng, Jianfeng Gao, Xiaodong Liu, Rangan Majumder, Andrew McNamara, Bhaskar Mitra, Tri Nguyen, Mir Rosenberg, Xia Song, Alina Stoica, Saurabh Tiwary, Tong Wang},
booktitle={InCoCo@NIPS},
year={2016}
}
```
|
marcusy/qqq | ---
license: mit
task_categories:
- translation
language:
- en
size_categories:
- 1K<n<10K
--- |
factckbr | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- pt
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- fact-checking
pretty_name: FACTCK BR
dataset_info:
features:
- name: url
dtype: string
- name: author
dtype: string
- name: date
dtype: string
- name: claim
dtype: string
- name: review
dtype: string
- name: title
dtype: string
- name: rating
dtype: float32
- name: best_rating
dtype: float32
- name: label
dtype:
class_label:
names:
'0': falso
'1': distorcido
'2': impreciso
'3': exagerado
'4': insustentável
'5': verdadeiro
'6': outros
'7': subestimado
'8': impossível provar
'9': discutível
'10': sem contexto
'11': de olho
'12': verdadeiro, mas
'13': ainda é cedo para dizer
splits:
- name: train
num_bytes: 750646
num_examples: 1313
download_size: 721314
dataset_size: 750646
---
# Dataset Card for FACTCK BR
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/jghm-f/FACTCK.BR
- **Repository:** https://github.com/jghm-f/FACTCK.BR
- **Paper:** https://dl.acm.org/doi/10.1145/3323503.3361698
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
A dataset for studying fake news in Portuguese, presenting supposedly false news items along with their respective fact checks and classifications.
The data is collected from ClaimReview, a structured data schema used by fact-checking agencies to share their results with search engines, enabling data collection in real time.
The FACTCK.BR dataset contains 1309 claims with their corresponding labels.
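The `label` feature is a 14-way class label; its id-to-name mapping, taken directly from the `class_label` names in this card's YAML, can be sketched as:

```python
# Label id -> Portuguese rating name, as declared in dataset_info above.
LABEL_NAMES = [
    "falso", "distorcido", "impreciso", "exagerado", "insustentável",
    "verdadeiro", "outros", "subestimado", "impossível provar",
    "discutível", "sem contexto", "de olho", "verdadeiro, mas",
    "ainda é cedo para dizer",
]

def label_name(label_id: int) -> str:
    """Map an integer `label` value to its human-readable rating."""
    return LABEL_NAMES[label_id]

print(label_name(0))  # falso
```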
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@hugoabonizio](https://github.com/hugoabonizio) for adding this dataset. |
CyberHarem/zas_m21_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of zas_m21/ZasM21/ZasM21 (Girls' Frontline)
This is the dataset of zas_m21/ZasM21/ZasM21 (Girls' Frontline), containing 121 images and their tags.
The core tags of this character are `short_hair, blue_hair, bangs, orange_eyes, earrings, eyewear_on_head, goggles_on_head, ahoge`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 121 | 161.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zas_m21_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 121 | 84.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zas_m21_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 287 | 178.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zas_m21_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 121 | 138.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zas_m21_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 287 | 259.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zas_m21_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/zas_m21_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, choker, fingerless_gloves, looking_at_viewer, nail_polish, solo, simple_background, collarbone, multicolored_nails, orange_nails, upper_body, blue_nails, single_earring, orange_goggles, white_background, closed_mouth, off_shoulder, necktie |
| 1 | 10 |  |  |  |  |  | 1girl, bare_shoulders, choker, solo, assault_rifle, fingerless_gloves, jewelry, nail_polish, black_gloves, boots, looking_at_viewer, sitting, mismatched_legwear, multicolored_nails, orange_goggles, uneven_legwear, black_footwear, blue_nails, character_name, garter_straps, holding_gun, orange_nails, simple_background, striped_thighhighs |
| 2 | 15 |  |  |  |  |  | bare_shoulders, white_dress, 1girl, looking_at_viewer, official_alternate_costume, solo, yellow_eyes, collarbone, wedding_dress, bridal_veil, hair_flower, holding, white_gloves, blush, choker, elbow_gloves, breasts, closed_mouth, white_flower |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_gloves | choker | fingerless_gloves | looking_at_viewer | nail_polish | solo | simple_background | collarbone | multicolored_nails | orange_nails | upper_body | blue_nails | single_earring | orange_goggles | white_background | closed_mouth | off_shoulder | necktie | assault_rifle | jewelry | boots | sitting | mismatched_legwear | uneven_legwear | black_footwear | character_name | garter_straps | holding_gun | striped_thighhighs | white_dress | official_alternate_costume | yellow_eyes | wedding_dress | bridal_veil | hair_flower | holding | white_gloves | blush | elbow_gloves | breasts | white_flower |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:---------|:--------------------|:--------------------|:--------------|:-------|:--------------------|:-------------|:---------------------|:---------------|:-------------|:-------------|:-----------------|:-----------------|:-------------------|:---------------|:---------------|:----------|:----------------|:----------|:--------|:----------|:---------------------|:-----------------|:-----------------|:-----------------|:----------------|:--------------|:---------------------|:--------------|:-----------------------------|:--------------|:----------------|:--------------|:--------------|:----------|:---------------|:--------|:---------------|:----------|:---------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | X | | X | | X | | X | | X | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
richfrain/semanticSegmentation | ---
license: apache-2.0
---
|
clinicalnlplab/medQA_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: train
num_bytes: 1887041
num_examples: 1273
- name: valid
num_bytes: 1887041
num_examples: 1273
- name: test
num_bytes: 1887041
num_examples: 1273
download_size: 2276631
dataset_size: 5661123
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
Fraser/python-state-changes | ---
language:
- code
---
# Python State Changes
State changes from the execution of single lines of Python code.
All code was taken from Python HackerRank solutions.
Scraped from my dataset of traced HackerRank solutions. https://www.kaggle.com/frasergreenlee/ran-hackerrank-solutions
```json
{"start": "g = 100; i = 1; l = [100, 100, 0, 0, -100, -100]", "code": "g += l[i]", "end": "g = 200; i = 1; l = [100, 100, 0, 0, -100, -100]"}
{"start": "a = 1; b = 2; d = 4; i = 3; j = 2", "code": "i, j = a + (j - b), b + (d - (i - a))", "end": "a = 1; b = 2; d = 4; i = 1; j = 4"}
{"start": "b = 15", "code": "b = b // 2", "end": "b = 7"}
```
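Each record can be checked mechanically: execute `start`, run `code`, and compare the resulting namespace against `end`. A minimal verifier sketch (it `exec`s arbitrary strings, so only run it on data you trust):

```python
def verify_record(record):
    """Replay a state-change record and confirm `end` describes the result."""
    namespace = {}
    exec(record["start"], {}, namespace)  # set up the starting state
    exec(record["code"], {}, namespace)   # run the single line of code
    expected = {}
    exec(record["end"], {}, expected)     # parse the expected end state
    return namespace == expected

record = {
    "start": "b = 15",
    "code": "b = b // 2",
    "end": "b = 7",
}
print(verify_record(record))  # True
```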
## Get an overview of the dataset by viewing the frequency of different ASTs.
👉 https://observablehq.com/@frasergreenlee/python-lines-dataset#chart |
knowgen/Manufacturing_IT | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1542998178
num_examples: 559796
download_size: 970960708
dataset_size: 1542998178
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Gummybear05/speed_changed | ---
dataset_info:
features:
- name: path
dtype: string
- name: filename
dtype: string
- name: text
dtype: string
- name: quality
dtype: string
- name: city
dtype: string
- name: gender
dtype: string
- name: age
dtype: string
- name: array
sequence: float64
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 9616935248
num_examples: 8531
- name: test
num_bytes: 258512151
num_examples: 120
download_size: 2030378461
dataset_size: 9875447399
---
# Dataset Card for "speed_changed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
barissglc/tarot | ---
language:
- en
--- |
drak247/Sinomacrops | ---
license: unknown
---
|
sayak0809/mentalhealth | ---
license: unknown
---
|
RikoteMaster/goemotion_4_llama2_v2 | ---
dataset_info:
features:
- name: Text_processed
dtype: string
- name: Emotion
dtype: string
- name: text
dtype: string
- name: Augmented
dtype: bool
splits:
- name: train
num_bytes: 12984427
num_examples: 36324
download_size: 4425317
dataset_size: 12984427
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "goemotion_4_llama2_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shossain/govreport-qa-512 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 33340
num_examples: 5
download_size: 15680
dataset_size: 33340
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "govreport-qa-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dr0l3/common_voice_da | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 4465399184
num_examples: 4649
- name: test
num_bytes: 2048786368
num_examples: 2133
download_size: 1032191895
dataset_size: 6514185552
---
|
youdiniplays/tl-bic | ---
license: mit
task_categories:
- translation
language:
- tl
--- |
autoevaluate/autoeval-staging-eval-project-f69c187c-a1f8-462d-8272-41a77bd1f8ed-97 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: autoevaluate/binary-classification
metrics: ['matthews_correlation']
dataset_name: glue
dataset_config: sst2
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: autoevaluate/binary-classification
* Dataset: glue
* Config: sst2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
joe1984/palabras | ---
license: apache-2.0
---
|
helloelwin/w2sg-generations | ---
dataset_info:
- config_name: eval_train_weak-bs=16-dn=gsm8k-e=10-ee=1000000-l=xent-l=2e-05-ls=cosi_anne-ml=331-ms=gemma-2b-nd=20000-ntd=10000-o=adam-s=0-twd=0-epoch=2
features:
- name: question
dtype: string
- name: gt_answer
dtype: string
- name: answer
dtype: string
- name: acc
dtype: float64
splits:
- name: train
num_bytes: 3262475
num_examples: 3736
download_size: 1748441
dataset_size: 3262475
- config_name: eval_train_weak-bs=16-dn=gsm8k-e=10-ee=1000000-l=xent-l=2e-05-ls=cosi_anne-ml=331-ms=gpt2-xl-nd=20000-ntd=10000-o=adam-s=0-twd=0-epoch=2
features:
- name: question
dtype: string
- name: gt_answer
dtype: string
- name: answer
dtype: string
- name: acc
dtype: float64
splits:
- name: train
num_bytes: 3557779
num_examples: 3736
download_size: 1762583
dataset_size: 3557779
configs:
- config_name: eval_train_weak-bs=16-dn=gsm8k-e=10-ee=1000000-l=xent-l=2e-05-ls=cosi_anne-ml=331-ms=gemma-2b-nd=20000-ntd=10000-o=adam-s=0-twd=0-epoch=2
data_files:
- split: train
path: eval_train_weak-bs=16-dn=gsm8k-e=10-ee=1000000-l=xent-l=2e-05-ls=cosi_anne-ml=331-ms=gemma-2b-nd=20000-ntd=10000-o=adam-s=0-twd=0-epoch=2/train-*
- config_name: eval_train_weak-bs=16-dn=gsm8k-e=10-ee=1000000-l=xent-l=2e-05-ls=cosi_anne-ml=331-ms=gpt2-xl-nd=20000-ntd=10000-o=adam-s=0-twd=0-epoch=2
data_files:
- split: train
path: eval_train_weak-bs=16-dn=gsm8k-e=10-ee=1000000-l=xent-l=2e-05-ls=cosi_anne-ml=331-ms=gpt2-xl-nd=20000-ntd=10000-o=adam-s=0-twd=0-epoch=2/train-*
---
|
atmallen/mmlu_binary | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype: int32
- name: statement
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: validation
num_bytes: 653717
num_examples: 1218
- name: test
num_bytes: 5979564
num_examples: 11526
download_size: 3456524
dataset_size: 6633281
---
# Dataset Card for "mmlu_binary"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marccgrau/sbbdata_snr_0 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 499781309.0
num_examples: 1121
- name: test
num_bytes: 63201474.0
num_examples: 142
- name: val
num_bytes: 62608885.0
num_examples: 141
download_size: 620800482
dataset_size: 625591668.0
---
# Dataset Card for "sbbdata_snr_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
seungheondoh/LP-MusicCaps-MTT | ---
license: mit
language:
- en
tags:
- art
- music
- text-to-music
- music-to-text
pretty_name: LP-MusicCaps-MTT
size_categories:
- 10K<n<100K
---
======================================
**!important**: Be careful when using `caption_attribute_prediction` (we do not recommend using it)!
======================================
# Dataset Card for LP-MusicCaps-MTT
## Dataset Description
- **Repository:** [LP-MusicCaps repository](https://github.com/seungheondoh/lp-music-caps)
- **Paper:** [ArXiv](https://arxiv.org/abs/2307.16372)
## Dataset Summary
**LP-MusicCaps** is a large language model-based pseudo music caption dataset for `text-to-music` and `music-to-text` tasks. We construct the music-to-caption pairs via tag-to-caption generation (using three existing multi-label tag datasets and four task instructions). The data sources are MusicCaps, Magnatagtune, and the Million Song Dataset ECALS subset.
- **LP-MusicCaps MTT (This Repo)**: 22k audio clips with 88k captions. We utilize the 188 unique tags in [Magnatagtune](https://mirg.city.ac.uk/codeapps/the-magnatagatune-dataset) to perform tag-to-caption generation through an LLM. Magnatagtune consists of 26k music clips from 5,223 unique songs, with tags covering genre, instrument, vocals, mood, perceptual tempo, origin, and sonority. We used the full 188-tag vocabulary and did not generate captions for tracks without associated tags (reducing the set to 22k).
- [LP-MusicCaps MSD](https://huggingface.co/datasets/seungheondoh/LP-MusicCaps-MSD): 0.5M audio clips with 2.2M captions.
- [LP-MusicCaps MC](https://huggingface.co/datasets/seungheondoh/LP-MusicCaps-MC): 6k audio clips with 22k captions.
## Data Instances
Each instance in LP-MusicCaps MTT (This Repo) represents an audio clip paired with multiple pseudo captions and meta-attributes:
```
{
'track_id': '1541',
'title': 'Eyes Closed (The Seldon Plan)',
'artist_name': 'Magnatune.com',
'release': 'Magnatune At The CC Salon',
'tag_top50': ['guitar', 'country', 'male', 'singing'],
'tag_top188': ['guitar',
'male singer',
'country',
'male vocals',
'male',
'singing'
],
'caption_writing': 'This country song features twangy guitar riffs and heartfelt male vocals, with a male singer singing about love and loss.',
'caption_summary': 'A male singer with a country style voice accompanies his guitar while singing.',
'caption_paraphrase': 'This male artist croons in a deep, soulful voice over the twangy sounds of his guitar, crafting a classic country tune perfect for fans of male vocals and raw, authentic singing.',
'caption_attribute_prediction': 'A twangy mix of acoustic guitar and male vocals come together in this heartfelt country song. With lyrics that evoke a sense of nostalgia, the male singer weaves a story of love and loss through his storytelling. His emotive singing grips you from start to finish, as he sings about the trials and tribulations of life. This song is a must-listen for any fan of country.',
'pseudo_attribute': ['acoustic',
'twangy',
'heartfelt',
'storytelling',
'nostalgic'
],
'path': 'e/magnatune_com-magnatune_at_the_cc_salon-01-eyes_closed_the_seldon_plan-30-59.mp3'
}
```
## Pseudo Caption Example:
Input Tags:
*"video game theme, no singer, instrumental, analog sounding, small keyboard, beatboxing, playful, cheerful, groovy"*
Output Pseudo Caption:
*"instrumental track has a joyful and playful vibe, perfect for a video game theme. With no singer, the analog-sounding music features a small keyboard and beatboxing, creating a groovy and cheerful atmosphere"*
[More Information for pseudo caption generation](https://github.com/seungheondoh/lp-music-caps/blob/main/lpmc/llm_captioning/generate.py)
## Data Fields
| Name | Type | Description |
|------------------------------|-----------------|----------------------------------------------------------------------|
| track_id | string | Unique identifier for the track |
| title | string | Title of the song |
| artist_name | string | Name of the artist performing the song |
| release | string | Release name or album name of the song |
| tag_top50 | list of strings | List of top 50 tags associated with the song |
| tag_top188 | list of strings | List of top 188 tags associated with the song |
| caption_writing | string | Pseudo caption generated through a writing instruction |
| caption_summary | string | Pseudo caption generated through a summary instruction |
| caption_paraphrase | string | Pseudo caption generated through a paraphrase instruction |
| caption_attribute_prediction | string | Pseudo caption generated through an attribute_prediction instruction |
| pseudo_attribute | list of strings | List of pseudo-attributes used in caption_attribute_prediction |
| path | string | File path or location of the audio clip |
## Data Splits
We used the full 188-tag vocabulary and did not generate captions for tracks without associated tags (26k => 22k). About 4k examples have empty tags and captions.
- train: 18706
- valid: 1825
- test: 5329
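Since some examples carry empty tags and captions, it can be worth filtering them out before training. A minimal sketch over plain dicts shaped like the instance above (for the real dataset, the same predicate can be passed to `datasets.Dataset.filter`):

```python
def has_tags(example):
    """Keep only examples with at least one tag and a non-empty caption."""
    return bool(example["tag_top188"]) and bool(example["caption_writing"])

# Illustrative examples mimicking the instance structure shown above
examples = [
    {"tag_top188": ["guitar", "country"], "caption_writing": "A country tune."},
    {"tag_top188": [], "caption_writing": ""},
]
kept = [ex for ex in examples if has_tags(ex)]
print(len(kept))  # 1
```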
## Considerations for Using the Data
The LP-MusicCaps dataset is recommended for research purposes. Due to a known mislabeling issue, we recommend not using `caption_attribute_prediction` and `pseudo_attribute` unless specifically for large-scale pretraining. Additionally, the field `is_crawled` indicates the samples used in the reference paper mentioned below.
## Discussion of Biases
It will be described in a paper to be released soon.
## Other Known Limitations
It will be described in a paper to be released soon. |
ArturoHurtado7/AntiSpoofing | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 14202353920.944
num_examples: 80816
download_size: 4585467118
dataset_size: 14202353920.944
---
# Dataset Card for "AntiSpoofing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lbls888/lbls | ---
license: apache-2.0
---
|
AIJUUD/test_data2 | ---
license: other
---
test |
daisy-o/images | ---
language:
- en
--- |
davanstrien/AiGen-FoodReview | ---
dataset_info:
features:
- name: ID
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
- name: automated_readability_index
dtype: float64
- name: difficult_words
dtype: int64
- name: flesch_reading_ease
dtype: float64
- name: gunning_fog
dtype: float64
- name: words_per_sentence
dtype: float64
- name: reading_time
dtype: float64
- name: ppl
dtype: float64
- name: bright
dtype: float64
- name: cont
dtype: float64
- name: warm
dtype: float64
- name: colorf
dtype: float64
- name: sd
dtype: float64
- name: cd
dtype: float64
- name: td
dtype: float64
- name: diag_dom
dtype: float64
- name: rot
dtype: float64
- name: hpvb
dtype: float64
- name: vpvb
dtype: float64
- name: hcvb
dtype: float64
- name: vcvb
dtype: float64
- name: sat
dtype: float64
- name: clar
dtype: float64
- name: image
dtype: image
splits:
- name: train
num_bytes: 1260144919.2
num_examples: 12086
- name: test
num_bytes: 432615568.19
num_examples: 4030
- name: valid
num_bytes: 440698812.212
num_examples: 4028
download_size: 1836929866
dataset_size: 2133459299.6020002
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
license: mit
language:
- en
pretty_name: >-
AiGen-FoodReview: A Multimodal Dataset of Machine-Generated Restaurant Reviews
and Images on Social Media
--- |
liuyanchen1015/MULTI_VALUE_mnli_zero_plural | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 1270270
num_examples: 5513
- name: dev_mismatched
num_bytes: 1352879
num_examples: 5665
- name: test_matched
num_bytes: 1274026
num_examples: 5518
- name: test_mismatched
num_bytes: 1351991
num_examples: 5714
- name: train
num_bytes: 50904287
num_examples: 219027
download_size: 36609054
dataset_size: 56153453
---
# Dataset Card for "MULTI_VALUE_mnli_zero_plural"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceM4/NLVR2_support_query_sets | Invalid username or password. |
renumics/beans-outlier | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- extended
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
pretty_name: Beans
dataset_info:
features:
- name: image_file_path
dtype: string
- name: image
dtype: image
- name: labels
dtype:
class_label:
names:
'0': angular_leaf_spot
'1': bean_rust
'2': healthy
- name: embedding_foundation
sequence: float32
- name: embedding_ft
sequence: float32
- name: outlier_score_ft
dtype: float64
- name: outlier_score_foundation
dtype: float64
- name: nn_image
dtype: image
splits:
- name: train
num_bytes: 293531811.754
num_examples: 1034
download_size: 0
dataset_size: 293531811.754
---
# Dataset Card for "beans-outlier"
📚 This dataset is an enhanced version of the [ibean project of the AIR lab](https://github.com/AI-Lab-Makerere/ibean/).
The workflow is described in the medium article: [Changes of Embeddings during Fine-Tuning of Transformers](https://medium.com/@markus.stoll/changes-of-embeddings-during-fine-tuning-c22aa1615921).
## Explore the Dataset
The open source data curation tool [Renumics Spotlight](https://github.com/Renumics/spotlight) allows you to explore this dataset. You can find a Hugging Face Space running Spotlight with this dataset here: <https://huggingface.co/spaces/renumics/beans-outlier>

Or you can explore it locally:
```python
!pip install renumics-spotlight datasets
from renumics import spotlight
import datasets
ds = datasets.load_dataset("renumics/beans-outlier", split="train")
df = ds.to_pandas()
df["label_str"] = df["labels"].apply(lambda x: ds.features["labels"].int2str(x))
dtypes = {
"nn_image": spotlight.Image,
"image": spotlight.Image,
"embedding_ft": spotlight.Embedding,
"embedding_foundation": spotlight.Embedding,
}
spotlight.show(
df,
dtype=dtypes,
layout="https://spotlight.renumics.com/resources/layout_pre_post_ft.json",
)
``` |
phyloforfun/HLT_MICH_Angiospermae_SLTPvA_v1-0__OCR-C25-L25-E25-R05 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 16878481512
num_examples: 10134076
download_size: 1579045698
dataset_size: 16878481512
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-sa-4.0
---
SLTPvA
Dataset:
- Alpaca format
- All MICH Angiospermae entries as of 28-11-2023 (v1-0)
Synthetic OCR:
- C25: 25% of cells are randomly converted to ALL CAPS
- L25: 25% of cells are randomly converted to all lowercase
- E25: 25% of all rows are subjected to synthetic OCR augmentation
- R05: 5% chance that a given character in an augmented row undergoes a substitution, deletion, or insertion error
- Synthetic OCR augmentation rows also have random strings inserted sporadically to simulate OCR noise
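The per-character errors described above (R05) can be sketched as a simple corruption pass. This is a minimal illustration, not the generator actually used; the replacement alphabet and the uniform choice among error types are assumptions:

```python
import random
import string

def ocr_noise(text, rate=0.05, rng=None):
    """Randomly substitute, delete, or insert characters at the given rate."""
    rng = rng or random.Random(0)
    out = []
    for ch in text:
        if rng.random() < rate:
            op = rng.choice(["sub", "del", "ins"])
            if op == "sub":
                out.append(rng.choice(string.ascii_letters))
            elif op == "ins":
                out.append(ch)
                out.append(rng.choice(string.ascii_letters))
            # "del": drop the character entirely
        else:
            out.append(ch)
    return "".join(out)

clean = "Angiospermae, MICH herbarium specimen label"
print(ocr_noise(clean, rate=0.05))
```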
System message:
Refactor the unstructured text into a valid JSON dictionary. The key names follow the Darwin Core Archive Standard. If a key lacks content, then insert an empty string. Fill in the following JSON structure as required: {\"catalogNumber\": \"\", \"order\": \"\", \"family\": \"\", \"scientificName\": \"\", \"scientificNameAuthorship\": \"\", \"genus\": \"\", \"subgenus\": \"\", \"specificEpithet\": \"\", \"verbatimTaxonRank\": \"\", \"infraspecificEpithet\": \"\", \"identifiedBy\": \"\", \"recordedBy\": \"\", \"recordNumber\": \"\", \"verbatimEventDate\": \"\", \"habitat\": \"\", \"occurrenceRemarks\": \"\", \"associatedTaxa\": \"\", \"country\": \"\", \"stateProvince\": \"\", \"county\": \"\", \"municipality\": \"\", \"locality\": \"\", \"decimalLatitude\": \"\", \"decimalLongitude\": \"\", \"verbatimCoordinates\": \"\", \"minimumElevationInMeters\": \"\", \"maximumElevationInMeters\": \"\"}
JSON format:
{
"catalogNumber": "",
"order": "",
"family": "",
"scientificName": "",
"scientificNameAuthorship": "",
"genus": "",
"subgenus": "",
"specificEpithet": "",
"verbatimTaxonRank": "",
"infraspecificEpithet": "",
"identifiedBy": "",
"recordedBy": "",
"recordNumber": "",
"verbatimEventDate": "",
"habitat": "",
"occurrenceRemarks": "",
"associatedTaxa": "",
"country": "",
"stateProvince": "",
"county": "",
"municipality": "",
"locality": "",
"decimalLatitude": "",
"decimalLongitude": "",
"verbatimCoordinates": "",
"minimumElevationInMeters": "",
"maximumElevationInMeters": ""
} |
open-llm-leaderboard/details_ChaoticNeutrals__This_is_fine_7B | ---
pretty_name: Evaluation run of ChaoticNeutrals/This_is_fine_7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ChaoticNeutrals/This_is_fine_7B](https://huggingface.co/ChaoticNeutrals/This_is_fine_7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChaoticNeutrals__This_is_fine_7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-23T06:36:13.280436](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__This_is_fine_7B/blob/main/results_2024-02-23T06-36-13.280436.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6492573273816704,\n\
\ \"acc_stderr\": 0.032258620394029484,\n \"acc_norm\": 0.6499843683422565,\n\
\ \"acc_norm_stderr\": 0.032916787491800534,\n \"mc1\": 0.4944920440636475,\n\
\ \"mc1_stderr\": 0.017502438990451067,\n \"mc2\": 0.6578951945217267,\n\
\ \"mc2_stderr\": 0.01514481956289198\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.681740614334471,\n \"acc_stderr\": 0.013611993916971453,\n\
\ \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725225\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7059350726946824,\n\
\ \"acc_stderr\": 0.004546901132945115,\n \"acc_norm\": 0.8728340967934675,\n\
\ \"acc_norm_stderr\": 0.003324778429495356\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110175,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110175\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923992,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923992\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.022891687984554956,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.022891687984554956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \
\ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829193,\n \
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829193\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323374,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323374\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.49273743016759775,\n\
\ \"acc_stderr\": 0.01672073740517951,\n \"acc_norm\": 0.49273743016759775,\n\
\ \"acc_norm_stderr\": 0.01672073740517951\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7427652733118971,\n\
\ \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.7427652733118971,\n\
\ \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303956,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303956\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355442,\n \
\ \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355442\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197772,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197772\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4944920440636475,\n\
\ \"mc1_stderr\": 0.017502438990451067,\n \"mc2\": 0.6578951945217267,\n\
\ \"mc2_stderr\": 0.01514481956289198\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8161010260457774,\n \"acc_stderr\": 0.010887916013305896\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6277482941622441,\n \
\ \"acc_stderr\": 0.013315375362565038\n }\n}\n```"
repo_url: https://huggingface.co/ChaoticNeutrals/This_is_fine_7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|arc:challenge|25_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|gsm8k|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hellaswag|10_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T06-36-13.280436.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T06-36-13.280436.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- '**/details_harness|winogrande|5_2024-02-23T06-36-13.280436.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-23T06-36-13.280436.parquet'
- config_name: results
data_files:
- split: 2024_02_23T06_36_13.280436
path:
- results_2024-02-23T06-36-13.280436.parquet
- split: latest
path:
- results_2024-02-23T06-36-13.280436.parquet
---
# Dataset Card for Evaluation run of ChaoticNeutrals/This_is_fine_7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChaoticNeutrals/This_is_fine_7B](https://huggingface.co/ChaoticNeutrals/This_is_fine_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChaoticNeutrals__This_is_fine_7B",
"harness_winogrande_5",
split="train")
```
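Each per-task configuration ultimately points at parquet files whose names follow a fixed scheme, visible in the `data_files` listings above. As an illustration only (the helper below is hypothetical, not part of any leaderboard tooling), the glob pattern for a given task and timestamp can be built like this:

```python
def details_glob(task: str, n_shot: int, timestamp: str) -> str:
    """Build the glob pattern used in the data_files configs above.

    The timestamp uses '-' instead of ':' to match the parquet filenames.
    """
    return f"**/details_harness|{task}|{n_shot}_{timestamp}.parquet"

print(details_glob("winogrande", 5, "2024-02-23T06-36-13.280436"))
# -> **/details_harness|winogrande|5_2024-02-23T06-36-13.280436.parquet
```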
## Latest results
These are the [latest results from run 2024-02-23T06:36:13.280436](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__This_is_fine_7B/blob/main/results_2024-02-23T06-36-13.280436.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6492573273816704,
"acc_stderr": 0.032258620394029484,
"acc_norm": 0.6499843683422565,
"acc_norm_stderr": 0.032916787491800534,
"mc1": 0.4944920440636475,
"mc1_stderr": 0.017502438990451067,
"mc2": 0.6578951945217267,
"mc2_stderr": 0.01514481956289198
},
"harness|arc:challenge|25": {
"acc": 0.681740614334471,
"acc_stderr": 0.013611993916971453,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.013352025976725225
},
"harness|hellaswag|10": {
"acc": 0.7059350726946824,
"acc_stderr": 0.004546901132945115,
"acc_norm": 0.8728340967934675,
"acc_norm_stderr": 0.003324778429495356
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110175,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110175
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923992,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923992
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554956,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829193,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834834,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.49273743016759775,
"acc_stderr": 0.01672073740517951,
"acc_norm": 0.49273743016759775,
"acc_norm_stderr": 0.01672073740517951
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7427652733118971,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.7427652733118971,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303956,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355442,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355442
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197772,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197772
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4944920440636475,
"mc1_stderr": 0.017502438990451067,
"mc2": 0.6578951945217267,
"mc2_stderr": 0.01514481956289198
},
"harness|winogrande|5": {
"acc": 0.8161010260457774,
"acc_stderr": 0.010887916013305896
},
"harness|gsm8k|5": {
"acc": 0.6277482941622441,
"acc_stderr": 0.013315375362565038
}
}
```
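The `all` block at the top aggregates the per-task metrics. As a sketch only (an unweighted mean over tasks; the leaderboard's exact aggregation may differ), such an aggregate could be computed like this:

```python
def aggregate_acc(results: dict) -> float:
    """Unweighted mean of the per-task 'acc' values, skipping the 'all' entry."""
    accs = [m["acc"] for task, m in results.items()
            if task != "all" and "acc" in m]
    return sum(accs) / len(accs)

# Tiny illustrative example (not the real results above).
example = {
    "all": {"acc": 0.5},
    "harness|winogrande|5": {"acc": 0.4},
    "harness|gsm8k|5": {"acc": 0.6},
}
print(aggregate_acc(example))  # -> 0.5
```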
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
useSword/Lora_Default | ---
license: apache-2.0
---
|
CyberHarem/indomitable_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of indomitable/インドミタブル/不挠 (Azur Lane)
This is the dataset of indomitable/インドミタブル/不挠 (Azur Lane), containing 428 images and their tags.
The core tags of this character are `long_hair, breasts, black_hair, very_long_hair, green_eyes, large_breasts, bangs, hair_between_eyes, huge_breasts, maid_headdress, hair_ornament, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 428 | 881.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/indomitable_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 428 | 396.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/indomitable_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1145 | 891.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/indomitable_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 428 | 730.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/indomitable_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1145 | 1.36 GiB | [Download](https://huggingface.co/datasets/CyberHarem/indomitable_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/indomitable_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, solo, white_dress, bare_shoulders, cleavage, looking_at_viewer, white_gloves, antlers, white_background, between_breasts, thighs, simple_background, sitting, white_flower, blush, elbow_gloves, navel |
| 1 | 5 |  |  |  |  |  | 1girl, antlers, bare_legs, cleavage, flower, full_body, high_heels, looking_at_viewer, solo, white_dress, white_footwear, white_gloves, simple_background, sitting, thighs, white_background, bare_shoulders, crossed_legs, elbow_gloves, horns |
| 2 | 8 |  |  |  |  |  | 1girl, black_skirt, cleavage, frilled_choker, looking_at_viewer, maid, miniskirt, official_alternate_costume, pleated_skirt, solo, white_pantyhose, between_breasts, sitting, blush, pillow, tongue_out |
| 3 | 5 |  |  |  |  |  | 1girl, black_skirt, blush, cleavage, looking_at_viewer, maid, official_alternate_costume, sitting, solo, white_background, white_pantyhose, simple_background, between_breasts, frilled_choker, pleated_skirt |
| 4 | 5 |  |  |  |  |  | 1boy, 1girl, hetero, official_alternate_costume, paizuri, penis, solo_focus, blush, breasts_squeezed_together, frilled_choker, breast_grab, cum_on_breasts, ejaculation, grabbing, heart-shaped_pupils, arm_garter, clothing_cutout, looking_at_viewer, maid, nipples, on_back, one_eye_closed, open_mouth, sidelocks, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_dress | bare_shoulders | cleavage | looking_at_viewer | white_gloves | antlers | white_background | between_breasts | thighs | simple_background | sitting | white_flower | blush | elbow_gloves | navel | bare_legs | flower | full_body | high_heels | white_footwear | crossed_legs | horns | black_skirt | frilled_choker | maid | miniskirt | official_alternate_costume | pleated_skirt | white_pantyhose | pillow | tongue_out | 1boy | hetero | paizuri | penis | solo_focus | breasts_squeezed_together | breast_grab | cum_on_breasts | ejaculation | grabbing | heart-shaped_pupils | arm_garter | clothing_cutout | nipples | on_back | one_eye_closed | open_mouth | sidelocks | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------|:-----------------|:-----------|:--------------------|:---------------|:----------|:-------------------|:------------------|:---------|:--------------------|:----------|:---------------|:--------|:---------------|:--------|:------------|:---------|:------------|:-------------|:-----------------|:---------------|:--------|:--------------|:-----------------|:-------|:------------|:-----------------------------|:----------------|:------------------|:---------|:-------------|:-------|:---------|:----------|:--------|:-------------|:----------------------------|:--------------|:-----------------|:--------------|:-----------|:----------------------|:-------------|:------------------|:----------|:----------|:-----------------|:-------------|:------------|:--------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | X | | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | | | X | X | | | | X | | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | | X | X | | | X | X | | X | X | | X | | | | | | | | | | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | X | | | | | | | | | X | | | | | | | | | | | X | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
ihassan1/auditor-sentiment | ---
annotations_creators:
- expert-generated
language: []
language_creators:
- expert-generated
license: []
multilinguality:
- monolingual
pretty_name: auditor-sentiment
size_categories: []
source_datasets: []
tags:
- auditor
- financial
- sentiment
- markets
task_categories:
- text-classification
task_ids:
- sentiment-scoring
---
# Dataset Card for Auditor Sentiment |
ruanchaves/rerelem_por_Latn_to_cat_Latn | ---
dataset_info:
features:
- name: docid
dtype: string
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: string
- name: same_text
dtype: bool
- name: __language__
dtype: string
splits:
- name: train
num_bytes: 1081392
num_examples: 2226
- name: validation
num_bytes: 363260
num_examples: 701
- name: test
num_bytes: 383612
num_examples: 805
download_size: 0
dataset_size: 1828264
---
# Dataset Card for "rerelem_por_Latn_to_cat_Latn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mayankguptakiwi/Virtical | ---
license: apache-2.0
---
This dataset contains data on job types, shift types, modalities, and procedures.
ctu-aic/csfever | ---
license: cc-by-sa-3.0
---
# CsFEVER experimental Fact-Checking dataset
A Czech dataset for fact verification, localized from the data points of [FEVER](https://arxiv.org/abs/1803.05355) using the localization scheme described in the [CTKFacts: Czech Datasets for Fact Verification](https://arxiv.org/abs/2201.11115) paper, which is currently being revised for publication in the LREV journal.
The version you are looking at was reformatted into *Claim*-*Evidence* string pairs for the specific task of NLI. A more general, document-retrieval-ready interpretation of our data points, which can be used for training and evaluating DR models over the June 2016 Wikipedia snapshot, can be found in the [data_dr]() folder in JSON Lines format.
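The document-retrieval variant mentioned above ships as JSON Lines. A minimal, generic reader (a sketch using only the standard library; the field names inside the files are not specified by this card) could look like:

```python
import json

def read_jsonl(path):
    """Yield one record (dict) per non-empty line of a JSON Lines file."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)
```

Each yielded record then corresponds to one data point of the DR split.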
## Data Statement
### Curation Rationale
TODO
|
ilsilfverskiold/tech-keywords-topics-summary | ---
dataset_info:
features:
- name: id
dtype: string
- name: source
dtype: string
- name: text
dtype: string
- name: timestamp
dtype: string
- name: reactions
dtype: int64
- name: engagement
dtype: int64
- name: url
dtype: string
- name: text_length
dtype: int64
- name: keywords
dtype: string
- name: topic
dtype: string
- name: summary
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3397963
num_examples: 7196
- name: validation
num_bytes: 298115
num_examples: 635
- name: test
num_bytes: 302271
num_examples: 635
download_size: 2438815
dataset_size: 3998349
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
memray/semeval | ---
license: cc-by-nc-sa-4.0
---
|
open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST | ---
pretty_name: Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wei123602/Llama-2-13b-FINETUNE4_TEST](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T04:35:36.269188](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST/blob/main/results_2023-10-25T04-35-36.269188.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2558724832214765,\n\
\ \"em_stderr\": 0.004468637497676013,\n \"f1\": 0.29727348993288566,\n\
\ \"f1_stderr\": 0.0043971826108447475,\n \"acc\": 0.4511208594202994,\n\
\ \"acc_stderr\": 0.010571455427847876\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2558724832214765,\n \"em_stderr\": 0.004468637497676013,\n\
\ \"f1\": 0.29727348993288566,\n \"f1_stderr\": 0.0043971826108447475\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13191811978771797,\n \
\ \"acc_stderr\": 0.009321265253857515\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838234\n\
\ }\n}\n```"
repo_url: https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|arc:challenge|25_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T04_35_36.269188
path:
- '**/details_harness|drop|3_2023-10-25T04-35-36.269188.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T04-35-36.269188.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T04_35_36.269188
path:
- '**/details_harness|gsm8k|5_2023-10-25T04-35-36.269188.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T04-35-36.269188.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hellaswag|10_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T23-17-56.003321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T23-17-56.003321.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-21T23-17-56.003321.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T04_35_36.269188
path:
- '**/details_harness|winogrande|5_2023-10-25T04-35-36.269188.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T04-35-36.269188.parquet'
- config_name: results
data_files:
- split: 2023_09_21T23_17_56.003321
path:
- results_2023-09-21T23-17-56.003321.parquet
- split: 2023_10_25T04_35_36.269188
path:
- results_2023-10-25T04-35-36.269188.parquet
- split: latest
path:
- results_2023-10-25T04-35-36.269188.parquet
---
# Dataset Card for Evaluation run of wei123602/Llama-2-13b-FINETUNE4_TEST
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/Llama-2-13b-FINETUNE4_TEST](https://huggingface.co/wei123602/Llama-2-13b-FINETUNE4_TEST) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST",
"harness_winogrande_5",
split="train")
```
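Since each run appears as a split named after its timestamp, a small helper can pick the most recent run programmatically. This is a minimal sketch (the `latest_split` helper is illustrative, not part of any library), assuming split names follow the `YYYY_MM_DDTHH_MM_SS.ffffff` pattern seen in the configs above:

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamp-named split.

    Split names look like '2023_09_21T23_17_56.003321' (underscores in
    the date and time parts); the 'latest' alias is skipped.
    """
    stamps = [s for s in split_names if s != "latest"]
    # Parse the underscore-separated timestamp so splits compare chronologically.
    def parse(s):
        return datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f")
    return max(stamps, key=parse)

print(latest_split([
    "2023_09_21T23_17_56.003321",
    "2023_10_25T04_35_36.269188",
    "latest",
]))
# prints 2023_10_25T04_35_36.269188
```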
## Latest results
These are the [latest results from run 2023-10-25T04:35:36.269188](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__Llama-2-13b-FINETUNE4_TEST/blob/main/results_2023-10-25T04-35-36.269188.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2558724832214765,
"em_stderr": 0.004468637497676013,
"f1": 0.29727348993288566,
"f1_stderr": 0.0043971826108447475,
"acc": 0.4511208594202994,
"acc_stderr": 0.010571455427847876
},
"harness|drop|3": {
"em": 0.2558724832214765,
"em_stderr": 0.004468637497676013,
"f1": 0.29727348993288566,
"f1_stderr": 0.0043971826108447475
},
"harness|gsm8k|5": {
"acc": 0.13191811978771797,
"acc_stderr": 0.009321265253857515
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838234
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zolak/twitter_dataset_81_1713076277 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2557160
num_examples: 6304
download_size: 1275554
dataset_size: 2557160
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_yam-peleg__Experiment20-7B | ---
pretty_name: Evaluation run of yam-peleg/Experiment20-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yam-peleg/Experiment20-7B](https://huggingface.co/yam-peleg/Experiment20-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yam-peleg__Experiment20-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-20T04:27:28.761237](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment20-7B/blob/main/results_2024-02-20T04-27-28.761237.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6388395808326348,\n\
\ \"acc_stderr\": 0.03248110016666022,\n \"acc_norm\": 0.6382917347789664,\n\
\ \"acc_norm_stderr\": 0.03316218579189402,\n \"mc1\": 0.6083231334149327,\n\
\ \"mc1_stderr\": 0.017087795881769646,\n \"mc2\": 0.7771507307486577,\n\
\ \"mc2_stderr\": 0.013765185430621489\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725223,\n\
\ \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869148\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7070304720175263,\n\
\ \"acc_stderr\": 0.004541944342035901,\n \"acc_norm\": 0.8861780521808404,\n\
\ \"acc_norm_stderr\": 0.0031694581233577238\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.02436259969303109,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.02436259969303109\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.02454761779480383,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.02454761779480383\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n\
\ \"acc_stderr\": 0.016588680864530626,\n \"acc_norm\": 0.43687150837988825,\n\
\ \"acc_norm_stderr\": 0.016588680864530626\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.0189754279205072,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.0189754279205072\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6083231334149327,\n\
\ \"mc1_stderr\": 0.017087795881769646,\n \"mc2\": 0.7771507307486577,\n\
\ \"mc2_stderr\": 0.013765185430621489\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8500394632991318,\n \"acc_stderr\": 0.010034394804580809\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6664139499620925,\n \
\ \"acc_stderr\": 0.012987282131410812\n }\n}\n```"
repo_url: https://huggingface.co/yam-peleg/Experiment20-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|arc:challenge|25_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|gsm8k|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hellaswag|10_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T04-27-28.761237.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T04-27-28.761237.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- '**/details_harness|winogrande|5_2024-02-20T04-27-28.761237.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-20T04-27-28.761237.parquet'
- config_name: results
data_files:
- split: 2024_02_20T04_27_28.761237
path:
- results_2024-02-20T04-27-28.761237.parquet
- split: latest
path:
- results_2024-02-20T04-27-28.761237.parquet
---
# Dataset Card for Evaluation run of yam-peleg/Experiment20-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yam-peleg/Experiment20-7B](https://huggingface.co/yam-peleg/Experiment20-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yam-peleg__Experiment20-7B",
"harness_winogrande_5",
                    split="latest")
```
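The config names listed in the YAML header follow a mechanical derivation from the harness task identifiers: pipes, hyphens, and colons are replaced with underscores. A small helper sketching that convention (this is an inference from the header above, not an official API of the `datasets` library):

```python
# Sketch of the config-naming convention used in this repo (an assumption
# inferred from the YAML header, not part of any library):
# "harness|hendrycksTest-anatomy|5" -> "harness_hendrycksTest_anatomy_5"

def task_to_config_name(task: str) -> str:
    """Convert a harness task identifier to its dataset config name."""
    # Pipe separators, subtask hyphens, and colon qualifiers all become
    # underscores, matching the config_name entries in the header.
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

print(task_to_config_name("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```

This can be handy for looping over the per-task results keys shown below and loading the matching config for each.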
## Latest results
These are the [latest results from run 2024-02-20T04:27:28.761237](https://huggingface.co/datasets/open-llm-leaderboard/details_yam-peleg__Experiment20-7B/blob/main/results_2024-02-20T04-27-28.761237.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one under the timestamped splits and the "latest" split of the corresponding config):
```json
{
"all": {
"acc": 0.6388395808326348,
"acc_stderr": 0.03248110016666022,
"acc_norm": 0.6382917347789664,
"acc_norm_stderr": 0.03316218579189402,
"mc1": 0.6083231334149327,
"mc1_stderr": 0.017087795881769646,
"mc2": 0.7771507307486577,
"mc2_stderr": 0.013765185430621489
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725223,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869148
},
"harness|hellaswag|10": {
"acc": 0.7070304720175263,
"acc_stderr": 0.004541944342035901,
"acc_norm": 0.8861780521808404,
"acc_norm_stderr": 0.0031694581233577238
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.02436259969303109,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.02436259969303109
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.02454761779480383,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.02454761779480383
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.016588680864530626,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.016588680864530626
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.0189754279205072,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.0189754279205072
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6083231334149327,
"mc1_stderr": 0.017087795881769646,
"mc2": 0.7771507307486577,
"mc2_stderr": 0.013765185430621489
},
"harness|winogrande|5": {
"acc": 0.8500394632991318,
"acc_stderr": 0.010034394804580809
},
"harness|gsm8k|5": {
"acc": 0.6664139499620925,
"acc_stderr": 0.012987282131410812
}
}
```
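The "all" block above is an aggregate over the individual task entries. As a sanity check, the MMLU portion of that aggregate can be recomputed by averaging the `acc` of every `hendrycksTest` subtask; a minimal sketch (not part of the evaluation harness, and using a truncated sample dict rather than the full results):

```python
# Recompute a macro-average accuracy over the MMLU (hendrycksTest) subtasks
# from a results dict shaped like the JSON above.

def mmlu_macro_acc(results: dict) -> float:
    """Mean of the 'acc' values of all hendrycksTest subtask entries."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Truncated sample with two MMLU subtasks and one non-MMLU task,
# copied from the values above.
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.4},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5851851851851851},
    "harness|arc:challenge|25": {"acc": 0.7030716723549488},
}
# Only the two hendrycksTest entries contribute to the macro average.
```

Applied to the full 57-subtask dict, this should land close to the `acc` reported under "all" (which also folds in the non-MMLU tasks, so the numbers will not match exactly).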
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-20000 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 15728318288
num_examples: 2500
download_size: 3090466943
dataset_size: 15728318288
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shidowake/FreedomIntelligence_alpaca-gpt4-japanese_subset_split_4 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 4863217.322740098
num_examples: 4997
download_size: 2556447
dataset_size: 4863217.322740098
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ukr-models/Ukr-Synth | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- uk
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
task_categories:
- token-classification
task_ids:
- named-entity-recognition
- parsing
- part-of-speech
pretty_name: Ukrainian synthetic dataset in conllu format
---
# Dataset Card for Ukr-Synth
## Dataset Description
### Dataset Summary
A large silver-standard Ukrainian corpus annotated with morphology tags, syntax trees, and PER/LOC/ORG NER tags.
It is a subsample of the [Leipzig Corpora Collection for Ukrainian Language](https://wortschatz.uni-leipzig.de/en/download/Ukrainian). The source texts are newspaper texts, split into sentences and shuffled. The sentences were annotated with transformer-based models trained on gold-standard Ukrainian datasets.
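The card metadata describes the corpus as being in CoNLL-U format; below is a minimal sketch of parsing one token line, assuming the standard ten-column CoNLL-U layout (the example line and the placement of the NER tag in the MISC column are illustrative assumptions, not taken from the corpus):

```python
from typing import Dict

# Standard CoNLL-U column names (assumed layout; field usage in
# Ukr-Synth itself may differ).
CONLLU_FIELDS = [
    "id", "form", "lemma", "upos", "xpos",
    "feats", "head", "deprel", "deps", "misc",
]

def parse_token_line(line: str) -> Dict[str, str]:
    """Split one tab-separated CoNLL-U token line into named fields."""
    cols = line.rstrip("\n").split("\t")
    if len(cols) != len(CONLLU_FIELDS):
        raise ValueError(f"expected {len(CONLLU_FIELDS)} columns, got {len(cols)}")
    return dict(zip(CONLLU_FIELDS, cols))

# Hypothetical example line (not drawn from the corpus):
example = "1\tКиїв\tКиїв\tPROPN\t_\tAnimacy=Inan|Case=Nom\t0\troot\t_\tNER=B-LOC"
token = parse_token_line(example)
print(token["form"], token["upos"], token["misc"])  # Київ PROPN NER=B-LOC
```

Real CoNLL-U files also contain comment lines starting with `#` and blank lines separating sentences, which a full reader would need to skip.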
### Languages
Ukrainian
## Dataset Structure
### Data Splits
| name |train |validation|
|---------|-------:|---------:|
|conll2003|1000000| 10000|
## Dataset Creation
### Source Data
Leipzig Corpora Collection:
D. Goldhahn, T. Eckart & U. Quasthoff: Building Large Monolingual Dictionaries at the Leipzig Corpora Collection: From 100 to 200 Languages.
In: Proceedings of the 8th International Conference on Language Resources and Evaluation (LREC'12), 2012
## Additional Information
### Licensing Information
MIT License
Copyright (c) 2022
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE. |
EdinburghNLP/xsum | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
pretty_name: Extreme Summarization (XSum)
paperswithcode_id: xsum
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- summarization
task_ids:
- news-articles-summarization
train-eval-index:
- config: default
task: summarization
task_id: summarization
splits:
train_split: train
eval_split: test
col_mapping:
document: text
summary: target
metrics:
- type: rouge
name: Rouge
dataset_info:
features:
- name: document
dtype: string
- name: summary
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 479206608
num_examples: 204045
- name: validation
num_bytes: 26292901
num_examples: 11332
- name: test
num_bytes: 26756165
num_examples: 11334
download_size: 257302866
dataset_size: 532255674
---
# Dataset Card for "xsum"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** https://github.com/EdinburghNLP/XSum
- **Paper:** [Don't Give Me the Details, Just the Summary! Topic-Aware Convolutional Neural Networks for Extreme Summarization](https://arxiv.org/abs/1808.08745)
- **Point of Contact:** [Shashi Narayan](mailto:shashi.narayan@ed.ac.uk)
- **Size of downloaded dataset files:** 257.30 MB
- **Size of the generated dataset:** 532.26 MB
- **Total amount of disk used:** 789.56 MB
### Dataset Summary
Extreme Summarization (XSum) Dataset.
There are three features:
- document: Input news article.
- summary: One sentence summary of the article.
- id: BBC ID of the article.
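Since each reference summary is a single sentence, a lead-sentence baseline is a common sanity check for this task; below is a minimal sketch (the regex-based sentence splitter is an illustrative assumption, not part of the dataset tooling):

```python
import re

def lead_one_summary(document: str) -> str:
    """Return the first sentence of a document as a naive baseline summary."""
    # Split on whitespace that follows sentence-ending punctuation.
    sentences = re.split(r"(?<=[.!?])\s+", document.strip())
    return sentences[0] if sentences else ""

doc = "The plant closed on Friday. Workers were told by email. A review is planned."
print(lead_one_summary(doc))  # The plant closed on Friday.
```

XSum summaries are abstractive, so this baseline is expected to score poorly; it is useful mainly as a lower bound when comparing models.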
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 257.30 MB
- **Size of the generated dataset:** 532.26 MB
- **Total amount of disk used:** 789.56 MB
An example of 'validation' looks as follows.
```
{
"document": "some-body",
"id": "29750031",
"summary": "some-sentence"
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `document`: a `string` feature.
- `summary`: a `string` feature.
- `id`: a `string` feature.
### Data Splits
| name |train |validation|test |
|-------|-----:|---------:|----:|
|default|204045| 11332|11334|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{Narayan2018DontGM,
title={Don't Give Me the Details, Just the Summary! Topic-Aware Convolutional Neural Networks for Extreme Summarization},
author={Shashi Narayan and Shay B. Cohen and Mirella Lapata},
journal={ArXiv},
year={2018},
volume={abs/1808.08745}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@lewtun](https://github.com/lewtun), [@mariamabarham](https://github.com/mariamabarham), [@jbragg](https://github.com/jbragg), [@lhoestq](https://github.com/lhoestq), [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset. |
galman33/gal_yair_8300_100x100 | ---
dataset_info:
features:
- name: lat
dtype: float64
- name: lon
dtype: float64
- name: country_code
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 142004157.0
num_examples: 8300
download_size: 141994031
dataset_size: 142004157.0
---
# Dataset Card for "yair_gal_small_resized"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jegormeister/dutch-snli | ---
language:
- nl
---
This is a Dutch translation of the SNLI dataset. The translation was produced with Google Translate. |
felanders/preprocessed | ---
license: mit
dataset_info:
features:
- name: report_id
dtype: string
- name: paragraph_nr
dtype: int64
- name: text
dtype: string
- name: n_words
dtype: int64
- name: filing_type
dtype: string
splits:
- name: evaluate
num_bytes: 838148369
num_examples: 1825821
- name: zero_shot
num_bytes: 45860634
num_examples: 100000
- name: active_learning
num_bytes: 45759361
num_examples: 100000
download_size: 491384621
dataset_size: 929768364
---
|
jonfd/ICC | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- is
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100M<n<1B
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
pretty_name: ICC
---
# Dataset Card for ICC
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Point of Contact:** [Jón Friðrik Daðason](mailto:jond19@ru.is)
### Dataset Summary
The Icelandic Crawled Corpus (ICC) contains approximately 930M tokens which have been scraped from a selection of Icelandic websites, including news sites, government websites and forums. The scraped text is presented in its original form, unannotated, untokenized and without deduplication.
### Supported Tasks and Leaderboards
The ICC is primarily intended for use in training language models. It can be combined with other corpora, such as the [Icelandic Gigaword Corpus](http://igc.arnastofnun.is/) and the Icelandic portion of the [mC4](https://huggingface.co/datasets/mc4) corpus.
### Languages
This corpus contains text in Icelandic, scraped from a variety of online sources.
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
Each scraped item consists of two fields:
* **url**: The source URL of the scraped text.
* **text**: The scraped text.
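Because the text is presented without deduplication, downstream users may want to drop exact duplicates before language-model training; below is a minimal sketch over records with the two fields above (exact SHA-256 matching on the text is an illustrative choice, not part of how the corpus was built):

```python
import hashlib

def dedupe_records(records):
    """Keep the first occurrence of each distinct text, comparing by hash.

    `records` is an iterable of dicts with the documented "url" and "text"
    fields. Exact-match hashing only removes verbatim duplicates; near-
    duplicate detection would need a fuzzier method (e.g. MinHash).
    """
    seen = set()
    unique = []
    for rec in records:
        digest = hashlib.sha256(rec["text"].encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(rec)
    return unique

# Hypothetical records (URLs and texts are made up for illustration):
sample = [
    {"url": "https://example.is/a", "text": "Halló heimur."},
    {"url": "https://example.is/b", "text": "Halló heimur."},  # duplicate text
    {"url": "https://example.is/c", "text": "Önnur setning."},
]
print(len(dedupe_records(sample)))  # 2
```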
### Data Splits
N/A
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
N/A
#### Who are the annotators?
N/A
### Personal and Sensitive Information
Although this corpus consists entirely of text collected from publicly available websites, it may contain some examples of personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This corpus was created by Jón Friðrik Daðason, during work done at the [Language and Voice Lab](https://lvl.ru.is/) at [Reykjavik University](https://www.ru.is/).
This project was funded by the Language Technology Programme for Icelandic 2019-2023. The programme, which is managed and coordinated by [Almannarómur](https://almannaromur.is/), is funded by the Icelandic Ministry of Education, Science and Culture.
### Licensing Information
This work is licensed under a Creative Commons Attribution 4.0
International License. Any text, HTML page links, information, metadata or
other materials in this work may be subject to separate terms and
conditions between you and the owners of such content.
If you are a copyright owner or an agent thereof and believe that any
content in this work infringes upon your copyrights, you may submit a
notification with the following information:
* Your full name and information reasonably sufficient to permit us to
contact you, such as mailing address, phone number and an email address.
* Identification of the copyrighted work you claim has been infringed.
* Identification of the material you claim is infringing and should be
removed, and information reasonably sufficient to permit us to locate
the material.
### Citation Information
N/A
### Contributions
Thanks to [@jonfd](https://github.com/jonfd) for adding this dataset.
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_TEST_2.2w | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_TEST_2.2w\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T15:56:51.054424](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_TEST_2.2w/blob/main/results_2023-09-22T15-56-51.054424.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.13653523489932887,\n\
\ \"em_stderr\": 0.0035162871401896623,\n \"f1\": 0.18752202181207997,\n\
\ \"f1_stderr\": 0.0035554972989016802,\n \"acc\": 0.4267978240433516,\n\
\ \"acc_stderr\": 0.009809122705480169\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.13653523489932887,\n \"em_stderr\": 0.0035162871401896623,\n\
\ \"f1\": 0.18752202181207997,\n \"f1_stderr\": 0.0035554972989016802\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \
\ \"acc_stderr\": 0.007740044337103793\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856544\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|arc:challenge|25_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T15_56_51.054424
path:
- '**/details_harness|drop|3_2023-09-22T15-56-51.054424.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T15-56-51.054424.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T15_56_51.054424
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-56-51.054424.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-56-51.054424.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hellaswag|10_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T06:14:16.488025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T06:14:16.488025.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T06:14:16.488025.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T15_56_51.054424
path:
- '**/details_harness|winogrande|5_2023-09-22T15-56-51.054424.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T15-56-51.054424.parquet'
- config_name: results
data_files:
- split: 2023_09_04T06_14_16.488025
path:
- results_2023-09-04T06:14:16.488025.parquet
- split: 2023_09_22T15_56_51.054424
path:
- results_2023-09-22T15-56-51.054424.parquet
- split: latest
path:
- results_2023-09-22T15-56-51.054424.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE2_TEST_2.2w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_TEST_2.2w",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T15:56:51.054424](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_TEST_2.2w/blob/main/results_2023-09-22T15-56-51.054424.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its "results" and "latest" splits):
```python
{
"all": {
"em": 0.13653523489932887,
"em_stderr": 0.0035162871401896623,
"f1": 0.18752202181207997,
"f1_stderr": 0.0035554972989016802,
"acc": 0.4267978240433516,
"acc_stderr": 0.009809122705480169
},
"harness|drop|3": {
"em": 0.13653523489932887,
"em_stderr": 0.0035162871401896623,
"f1": 0.18752202181207997,
"f1_stderr": 0.0035554972989016802
},
"harness|gsm8k|5": {
"acc": 0.08642911296436695,
"acc_stderr": 0.007740044337103793
},
"harness|winogrande|5": {
"acc": 0.7671665351223362,
"acc_stderr": 0.011878201073856544
}
}
```
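As a quick sanity check on the numbers above, the aggregate `acc` under `"all"` appears to be the simple mean of the per-task `acc` values. The snippet below is a sketch under that assumption (an unweighted mean over the tasks reporting `acc`); the leaderboard's exact aggregation may differ:

```python
import math

# Per-task accuracies copied from the latest results above.
results = {
    "all": {"acc": 0.4267978240433516},
    "harness|gsm8k|5": {"acc": 0.08642911296436695},
    "harness|winogrande|5": {"acc": 0.7671665351223362},
}

# Mean of the per-task acc values, excluding the aggregate "all" entry.
task_accs = [v["acc"] for name, v in results.items() if name != "all"]
mean_acc = sum(task_accs) / len(task_accs)

# Within floating-point tolerance, this matches the reported aggregate.
assert math.isclose(mean_acc, results["all"]["acc"])
```

To pull the same numbers programmatically, loading the `results` configuration with `split="latest"` should return the aggregated parquet file listed in the YAML header above.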
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
phantue/abcdads | ---
license: mit
---
|
mtkinit/dsae | ---
pretty_name: dsae
tags:
- esa
---
# dsae
Created from AIOD platform |