| datasetId (string, length 2-117) | card (string, length 19-1.01M) |
|---|---|
emaeon/train2 | ---
dataset_info:
features:
- name: code1
dtype: string
- name: code2
dtype: string
- name: similar
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 9011675974
num_examples: 5000000
download_size: 0
dataset_size: 9011675974
---
# Dataset Card for "train2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wanadzhar913/crawl-timchew | ---
license: apache-2.0
language:
- en
---
TL;DR
- website: [timchew](https://timchew.net/)
- number of webpages scraped: 839
- link to dataset: https://huggingface.co/datasets/wanadzhar913/crawl-timchew
- last date of scraping: 10th September 2023
- status: complete
- pull request: https://github.com/huseinzol05/malaysian-dataset/pull/313
- contributed to: https://github.com/huseinzol05/malaysian-dataset |
CyberHarem/larva_tiamat_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of larva_tiamat/ラーヴァ/ティアマト/幼体/提亚马特 (Fate/Grand Order)
This is the dataset of larva_tiamat/ラーヴァ/ティアマト/幼体/提亚马特 (Fate/Grand Order), containing 208 images and their tags.
The core tags of this character are `horns, curled_horns, symbol-shaped_pupils, pink_eyes, pointy_ears, braid, breasts, long_hair, blue_hair, aqua_hair, hair_between_eyes, very_long_hair, x-shaped_pupils, small_breasts, large_breasts, crown_braid, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
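The core-tag pruning mentioned above can be sketched as follows. This is an illustrative sketch, not the actual DeepGHS pipeline; the `CORE_TAGS` set below is truncated from the list of core tags given earlier, and the function name is hypothetical.

```python
# Illustrative sketch of core-tag pruning: a character's invariant tags carry
# no per-image information, so they are removed from each image's tag list.
# NOT the actual DeepGHS pipeline; names and the tag set are illustrative.
CORE_TAGS = {
    "horns", "curled_horns", "symbol-shaped_pupils", "pink_eyes",
    "pointy_ears", "braid", "long_hair", "blue_hair",  # (truncated)
}

def prune_core_tags(tags):
    """Return the per-image tags with the character's core tags removed."""
    return [t for t in tags if t not in CORE_TAGS]

print(prune_core_tags(["horns", "1girl", "pink_eyes", "white_dress", "solo"]))
# ['1girl', 'white_dress', 'solo']
```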
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 208 | 407.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/larva_tiamat_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 208 | 348.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/larva_tiamat_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 517 | 662.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/larva_tiamat_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/larva_tiamat_fgo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Tag clustering results are listed below; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, choker, long_sleeves, pendant, ribbed_dress, solo, white_dress, thighs, white_thighhighs, looking_at_viewer, sleeves_past_fingers, open_mouth |
| 1 | 5 |  |  |  |  |  | 1girl, choker, long_sleeves, pendant, ribbed_dress, solo, white_dress, +_+, looking_at_viewer, open_mouth, red_eyes, collarbone |
| 2 | 7 |  |  |  |  |  | 1girl, choker, pendant, ribbed_dress, solo, white_dress, green_horns, long_sleeves, looking_at_viewer, collarbone, open_mouth |
| 3 | 21 |  |  |  |  |  | 1girl, looking_at_viewer, solo, long_sleeves, sleeves_past_fingers, black_dress, black_scarf, ribbed_dress, white_pantyhose, blue_scarf |
| 4 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, navel, solo, cleavage, stomach_tattoo, green_horns, +_+, jewelry, pubic_tattoo, blue_nails, smile, thighs |
| 5 | 8 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, solo_focus, collarbone, completely_nude, open_mouth, pendant, penis, sweat, choker, looking_at_viewer, navel, sex, thighs, girl_on_top, mosaic_censoring, pussy, sidelocks, vaginal, cowgirl_position, cum, smile, spread_legs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | choker | long_sleeves | pendant | ribbed_dress | solo | white_dress | thighs | white_thighhighs | looking_at_viewer | sleeves_past_fingers | open_mouth | +_+ | red_eyes | collarbone | green_horns | black_dress | black_scarf | white_pantyhose | blue_scarf | navel | cleavage | stomach_tattoo | jewelry | pubic_tattoo | blue_nails | smile | 1boy | blush | hetero | nipples | solo_focus | completely_nude | penis | sweat | sex | girl_on_top | mosaic_censoring | pussy | sidelocks | vaginal | cowgirl_position | cum | spread_legs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:---------------|:----------|:---------------|:-------|:--------------|:---------|:-------------------|:--------------------|:-----------------------|:-------------|:------|:-----------|:-------------|:--------------|:--------------|:--------------|:------------------|:-------------|:--------|:-----------|:-----------------|:----------|:---------------|:-------------|:--------|:-------|:--------|:---------|:----------|:-------------|:------------------|:--------|:--------|:------|:--------------|:-------------------|:--------|:------------|:----------|:-------------------|:------|:--------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 21 |  |  |  |  |  | X | | X | | X | X | | | | X | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | | | | | X | | X | | X | | | X | | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | | X | | | | X | | X | | X | | | X | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
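The "Table Version" above is simply a membership matrix derived from the "Raw Text Version": one column per tag, with an X wherever a cluster's tag list contains that tag. A minimal sketch of that derivation (the clusters and column order here are abbreviated examples, not the full data above):

```python
# Derive an X-mark membership table from per-cluster tag sets.
# Abbreviated example clusters; the real tables above have many more tags.
clusters = {
    0: {"1girl", "choker", "solo", "white_dress"},
    3: {"1girl", "solo", "black_dress", "black_scarf"},
}
# One column per tag, in a fixed order.
columns = ["1girl", "choker", "solo", "white_dress", "black_dress", "black_scarf"]

rows = {
    cid: ["X" if tag in tags else "" for tag in columns]
    for cid, tags in clusters.items()
}
print(rows[0])  # ['X', 'X', 'X', 'X', '', '']
print(rows[3])  # ['X', '', 'X', '', 'X', 'X']
```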
|
AdapterOcean/med_alpaca_standardized_cluster_77_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 12046782
num_examples: 22998
download_size: 6358389
dataset_size: 12046782
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_77_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zjguoHF/tokenized_tiny_shakespeare_dataset | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 2638965
num_examples: 32592
- name: validation
num_bytes: 223829
num_examples: 2795
- name: test
num_bytes: 358179
num_examples: 4611
download_size: 988180
dataset_size: 3220973
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
heliosprime/twitter_dataset_1713045897 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13059
num_examples: 28
download_size: 9457
dataset_size: 13059
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713045897"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JamesNetflix/clothing-256 | ---
dataset_info:
features:
- name: split
dtype: string
- name: label
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 510209201.5
num_examples: 5108
download_size: 510152513
dataset_size: 510209201.5
---
# Dataset Card for "clothing-256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
khoomeik/gzipscale-0.35-100M | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 322419387
num_examples: 390625
download_size: 84788817
dataset_size: 322419387
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
qayqaq/github-issues | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 11732758
num_examples: 1000
download_size: 3212220
dataset_size: 11732758
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ttaront/iz_mc4_jp | ---
language:
- ja
--- |
adrianex00/Gimpson2016 | ---
license: openrail
---
|
Ranjit/tie_data | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 2424512295.4
num_examples: 7222
- name: test
num_bytes: 593034901.192
num_examples: 1806
download_size: 2818485325
dataset_size: 3017547196.592
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_Aryanne__sheared-silicon10p | ---
pretty_name: Evaluation run of Aryanne/sheared-silicon10p
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aryanne/sheared-silicon10p](https://huggingface.co/Aryanne/sheared-silicon10p)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aryanne__sheared-silicon10p\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-24T01:36:29.411153](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__sheared-silicon10p/blob/main/results_2024-01-24T01-36-29.411153.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2599470421169613,\n\
\ \"acc_stderr\": 0.030910228069377786,\n \"acc_norm\": 0.26177545641423533,\n\
\ \"acc_norm_stderr\": 0.03173012245284717,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608774,\n \"mc2\": 0.44852499686717784,\n\
\ \"mc2_stderr\": 0.015613508500309613\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.32764505119453924,\n \"acc_stderr\": 0.013715847940719342,\n\
\ \"acc_norm\": 0.36177474402730375,\n \"acc_norm_stderr\": 0.014041957945038073\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3850826528579964,\n\
\ \"acc_stderr\": 0.004856203374715455,\n \"acc_norm\": 0.5111531567416849,\n\
\ \"acc_norm_stderr\": 0.004988539870174639\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351586,\n\
\ \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351586\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.025907897122408173,\n\
\ \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.025907897122408173\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2127659574468085,\n \"acc_stderr\": 0.026754391348039776,\n\
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.026754391348039776\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.03268454013011743,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.03268454013011743\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22580645161290322,\n\
\ \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.22580645161290322,\n\
\ \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03010833071801162,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03010833071801162\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916648,\n\
\ \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916648\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2743589743589744,\n \"acc_stderr\": 0.0226227657674932,\n \
\ \"acc_norm\": 0.2743589743589744,\n \"acc_norm_stderr\": 0.0226227657674932\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275788,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275788\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958955,\n\
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958955\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22568807339449543,\n\
\ \"acc_stderr\": 0.01792308766780306,\n \"acc_norm\": 0.22568807339449543,\n\
\ \"acc_norm_stderr\": 0.01792308766780306\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02988691054762696,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02988691054762696\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.03096451792692341,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.03096451792692341\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.02917868230484256,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.02917868230484256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n\
\ \"acc_stderr\": 0.026241132996407266,\n \"acc_norm\": 0.18834080717488788,\n\
\ \"acc_norm_stderr\": 0.026241132996407266\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03894641120044792,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03894641120044792\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.028911208802749475,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.028911208802749475\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225586,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225586\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n\
\ \"acc_stderr\": 0.02558306248998484,\n \"acc_norm\": 0.2829581993569132,\n\
\ \"acc_norm_stderr\": 0.02558306248998484\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.025407197798890155,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.025407197798890155\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2685788787483703,\n\
\ \"acc_stderr\": 0.011320056629121734,\n \"acc_norm\": 0.2685788787483703,\n\
\ \"acc_norm_stderr\": 0.011320056629121734\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.023886881922440362,\n\
\ \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.023886881922440362\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2761437908496732,\n \"acc_stderr\": 0.018087276935663137,\n \
\ \"acc_norm\": 0.2761437908496732,\n \"acc_norm_stderr\": 0.018087276935663137\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.02752963744017492,\n\
\ \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.02752963744017492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\
\ \"acc_stderr\": 0.03106939026078942,\n \"acc_norm\": 0.19879518072289157,\n\
\ \"acc_norm_stderr\": 0.03106939026078942\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608774,\n \"mc2\": 0.44852499686717784,\n\
\ \"mc2_stderr\": 0.015613508500309613\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5722178374112076,\n \"acc_stderr\": 0.013905134013839955\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Aryanne/sheared-silicon10p
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|arc:challenge|25_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|gsm8k|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hellaswag|10_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T01-36-29.411153.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T01-36-29.411153.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- '**/details_harness|winogrande|5_2024-01-24T01-36-29.411153.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-24T01-36-29.411153.parquet'
- config_name: results
data_files:
- split: 2024_01_24T01_36_29.411153
path:
- results_2024-01-24T01-36-29.411153.parquet
- split: latest
path:
- results_2024-01-24T01-36-29.411153.parquet
---
# Dataset Card for Evaluation run of Aryanne/sheared-silicon10p
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Aryanne/sheared-silicon10p](https://huggingface.co/Aryanne/sheared-silicon10p) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-silicon10p",
	"harness_winogrande_5",
	split="latest")
```
## Latest results

These are the [latest results from run 2024-01-24T01:36:29.411153](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__sheared-silicon10p/blob/main/results_2024-01-24T01-36-29.411153.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the per-task configurations, under the timestamped splits and the "latest" split):
```python
{
"all": {
"acc": 0.2599470421169613,
"acc_stderr": 0.030910228069377786,
"acc_norm": 0.26177545641423533,
"acc_norm_stderr": 0.03173012245284717,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608774,
"mc2": 0.44852499686717784,
"mc2_stderr": 0.015613508500309613
},
"harness|arc:challenge|25": {
"acc": 0.32764505119453924,
"acc_stderr": 0.013715847940719342,
"acc_norm": 0.36177474402730375,
"acc_norm_stderr": 0.014041957945038073
},
"harness|hellaswag|10": {
"acc": 0.3850826528579964,
"acc_stderr": 0.004856203374715455,
"acc_norm": 0.5111531567416849,
"acc_norm_stderr": 0.004988539870174639
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351586,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351586
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.025907897122408173,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.025907897122408173
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.026754391348039776,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.026754391348039776
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011743,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22580645161290322,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.22580645161290322,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03010833071801162,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03010833071801162
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.03182155050916648,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.03182155050916648
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2743589743589744,
"acc_stderr": 0.0226227657674932,
"acc_norm": 0.2743589743589744,
"acc_norm_stderr": 0.0226227657674932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275788,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275788
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958955,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958955
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.01792308766780306,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.01792308766780306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02988691054762696,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02988691054762696
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.03096451792692341,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.03096451792692341
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.02917868230484256,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.02917868230484256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.18834080717488788,
"acc_stderr": 0.026241132996407266,
"acc_norm": 0.18834080717488788,
"acc_norm_stderr": 0.026241132996407266
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03894641120044792,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03894641120044792
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749475,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749475
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225586,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225586
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.02558306248998484,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.02558306248998484
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.025407197798890155,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.025407197798890155
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2685788787483703,
"acc_stderr": 0.011320056629121734,
"acc_norm": 0.2685788787483703,
"acc_norm_stderr": 0.011320056629121734
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.023886881922440362,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.023886881922440362
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2761437908496732,
"acc_stderr": 0.018087276935663137,
"acc_norm": 0.2761437908496732,
"acc_norm_stderr": 0.018087276935663137
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.02752963744017492,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.02752963744017492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.03106939026078942,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.03106939026078942
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608774,
"mc2": 0.44852499686717784,
"mc2_stderr": 0.015613508500309613
},
"harness|winogrande|5": {
"acc": 0.5722178374112076,
"acc_stderr": 0.013905134013839955
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
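The `acc_stderr` values in the results above are consistent with the sample standard error of a proportion, sqrt(p(1-p)/(n-1)). As a quick sanity check, a minimal sketch (assuming the business_ethics subject has 100 questions, as in standard MMLU):

```python
import math

def acc_stderr(acc: float, n: int) -> float:
    # Sample standard error of a proportion over n questions.
    return math.sqrt(acc * (1 - acc) / (n - 1))

# business_ethics above reports acc = 0.23 with stderr ≈ 0.042295
print(acc_stderr(0.23, 100))
```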
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tyzhu/eval_tag_nq_test_v12_first_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: question
dtype: string
- name: title
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: id
dtype: string
- name: titles
dtype: string
splits:
- name: train
num_bytes: 3310
num_examples: 10
- name: validation
num_bytes: 1306262
num_examples: 3610
download_size: 0
dataset_size: 1309572
---
# Dataset Card for "eval_tag_nq_test_v12_first_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gigant/tib_002 | ---
dataset_info:
features:
- name: doi
dtype: string
- name: title
dtype: string
- name: url
dtype: string
- name: video_url
dtype: string
- name: license
dtype: string
- name: subject
dtype: string
- name: genre
dtype: string
- name: release_year
dtype: string
- name: author
dtype: string
- name: contributors
dtype: string
- name: abstract
dtype: string
- name: transcript
dtype: string
- name: transcript_segments
sequence:
- name: id
dtype: int32
- name: seek
dtype: int32
- name: start
dtype: float32
- name: end
dtype: float32
- name: text
dtype: string
- name: tokens
sequence: int32
- name: temperature
dtype: float32
- name: avg_logprob
dtype: float32
- name: compression_ratio
dtype: float32
- name: no_speech_prob
dtype: float32
- name: keyframes
sequence:
- name: slide
dtype: string
- name: frames
sequence: int32
- name: timestamp
sequence: float32
- name: language
dtype: string
splits:
- name: train
num_bytes: 1011381643.8712058
num_examples: 8831
download_size: 486130872
dataset_size: 1011381643.8712058
---
# Dataset Card for "tib_002"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maheboob/guanaco-llama-2-chat | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1655208
num_examples: 1000
download_size: 966969
dataset_size: 1655208
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama-2-chat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-conll2003-conll2003-bc26c9-1485554294 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: mariolinml/roberta_large-ner-conll2003_0818_v0
metrics: []
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: mariolinml/roberta_large-ner-conll2003_0818_v0
* Dataset: conll2003
* Config: conll2003
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
maahir-garg/refined-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4201526
num_examples: 1000
download_size: 2247080
dataset_size: 4201526
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SEACrowd/nusaparagraph_topic | ---
tags:
- topic-modeling
language:
- btk
- bew
- bug
- jav
- mad
- mak
- min
- mui
- rej
- sun
---
# nusaparagraph_topic
Democratizing access to natural language processing (NLP) technology is crucial, especially for underrepresented and extremely low-resource languages. Previous research has focused on developing labeled and unlabeled corpora for these languages through online scraping and document translation. While these methods have proven effective and cost-efficient, we have identified limitations in the resulting corpora, including a lack of lexical diversity and cultural relevance to local communities. To address this gap, we conduct a case study on Indonesian local languages. We compare the effectiveness of online scraping, human translation, and paragraph writing by native speakers in constructing datasets. Our findings demonstrate that datasets generated through paragraph writing by native speakers exhibit superior quality in terms of lexical diversity and cultural content. In addition, we present the NusaWrites benchmark, encompassing 12 underrepresented and extremely low-resource languages spoken by millions of individuals in Indonesia. Our empirical experiment results using existing multilingual large language models conclude the need to extend these models to more underrepresented languages.
We introduce a novel high-quality, human-curated corpus, NusaMenulis, which covers 12 languages spoken in Indonesia. The resource extends language coverage to 5 new languages: Ambon (abs), Bima (bhp), Makassarese (mak), Palembang / Musi (mui), and Rejang (rej).
For the topic modeling task, we cover 8 topics: food & beverages, sports, leisure, religion, culture & heritage, a slice of life, technology, and business.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@unpublished{anonymous2023nusawrites:,
title={NusaWrites: Constructing High-Quality Corpora for Underrepresented and Extremely Low-Resource Languages},
author={Anonymous},
journal={OpenReview Preprint},
year={2023},
note={anonymous preprint under review}
}
```
## License
Creative Commons Attribution Share-Alike 4.0 International
## Homepage
[https://github.com/IndoNLP/nusa-writes](https://github.com/IndoNLP/nusa-writes)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
Dahoas/svamp | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 347184
num_examples: 700
- name: test
num_bytes: 148692
num_examples: 300
download_size: 169303
dataset_size: 495876
---
# Dataset Card for "svamp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-glue-e9a4b61a-13985914 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: natural_language_inference
model: autoevaluate/glue-mrpc
metrics: []
dataset_name: glue
dataset_config: mrpc
dataset_split: validation
col_mapping:
text1: sentence1
text2: sentence2
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: autoevaluate/glue-mrpc
* Dataset: glue
* Config: mrpc
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
Divyanshu/indicxnli | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
language:
- as
- bn
- gu
- hi
- kn
- ml
- mr
- or
- pa
- ta
- te
license:
- cc0-1.0
multilinguality:
- multilingual
pretty_name: IndicXNLI
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- natural-language-inference
---
# Dataset Card for "IndicXNLI"
## Table of Contents
- [Dataset Card for "IndicXNLI"](#dataset-card-for-indicxnli)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
## Dataset Description
- **Homepage:** <https://github.com/divyanshuaggarwal/IndicXNLI>
- **Paper:** [IndicXNLI: Evaluating Multilingual Inference for Indian Languages](https://arxiv.org/abs/2204.08776)
- **Point of Contact:** [Divyanshu Aggarwal](mailto:divyanshuggrwl@gmail.com)
### Dataset Summary
INDICXNLI is similar to the existing
XNLI dataset in shape/form, but focuses on the Indic language family. INDICXNLI includes NLI
data for eleven major Indic languages:
Assamese (‘as’), Gujarati (‘gu’), Kannada (‘kn’),
Malayalam (‘ml’), Marathi (‘mr’), Odia (‘or’),
Punjabi (‘pa’), Tamil (‘ta’), Telugu (‘te’), Hindi
(‘hi’), and Bengali (‘bn’).
### Supported Tasks and Leaderboards
**Tasks:** Natural Language Inference
**Leaderboards:** Currently there is no Leaderboard for this dataset.
### Languages
- `Assamese (as)`
- `Bengali (bn)`
- `Gujarati (gu)`
- `Kannada (kn)`
- `Hindi (hi)`
- `Malayalam (ml)`
- `Marathi (mr)`
- `Oriya (or)`
- `Punjabi (pa)`
- `Tamil (ta)`
- `Telugu (te)`
## Dataset Structure
### Data Instances
One example from the `hi` dataset is given below.
```python
{'premise': 'अवधारणात्मक रूप से क्रीम स्किमिंग के दो बुनियादी आयाम हैं-उत्पाद और भूगोल।',
 'hypothesis': 'उत्पाद और भूगोल क्रीम स्किमिंग का काम करते हैं।',
 'label': 1}  # 1 = neutral
```
### Data Fields
- `premise (string)`: Premise Sentence
- `hypothesis (string)`: Hypothesis Sentence
- `label (integer)`: Integer label `0` if hypothesis `entails` the premise, `2` if hypothesis `negates` the premise and `1` otherwise.
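A minimal sketch of mapping the integer labels above to the conventional XNLI label names (the dataset itself stores only the integers; the string names are the usual NLI convention):

```python
# 0 = entailment, 1 = neutral, 2 = contradiction (see the field description above)
ID2LABEL = {0: "entailment", 1: "neutral", 2: "contradiction"}

# Illustrative example record; premise/hypothesis texts elided.
example = {
    "premise": "...",
    "hypothesis": "...",
    "label": 1,
}
print(ID2LABEL[example["label"]])  # neutral
```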
### Data Splits
<!-- Below is the dataset split given for `hi` dataset.
```python
DatasetDict({
train: Dataset({
features: ['premise', 'hypothesis', 'label'],
num_rows: 392702
})
test: Dataset({
features: ['premise', 'hypothesis', 'label'],
num_rows: 5010
})
validation: Dataset({
features: ['premise', 'hypothesis', 'label'],
num_rows: 2490
})
})
``` -->
| Language | ISO 639-1 Code | Train | Test | Dev |
|----------|----------------|---------|-------|-------|
| Assamese | as | 392,702 | 5,010 | 2,490 |
| Bengali | bn | 392,702 | 5,010 | 2,490 |
| Gujarati | gu | 392,702 | 5,010 | 2,490 |
| Hindi | hi | 392,702 | 5,010 | 2,490 |
| Kannada | kn | 392,702 | 5,010 | 2,490 |
| Malayalam | ml | 392,702 | 5,010 | 2,490 |
| Marathi | mr | 392,702 | 5,010 | 2,490 |
| Oriya | or | 392,702 | 5,010 | 2,490 |
| Punjabi | pa | 392,702 | 5,010 | 2,490 |
| Tamil | ta | 392,702 | 5,010 | 2,490 |
| Telugu | te | 392,702 | 5,010 | 2,490 |
<!-- The dataset split remains same across all languages. -->
## Dataset usage
Code snippet for loading the dataset with the `datasets` library.
```python
from datasets import load_dataset
dataset = load_dataset("Divyanshu/indicxnli")
```
## Dataset Creation
Machine translation of the English XNLI dataset into the 11 listed Indic languages.
### Curation Rationale
[More information needed]
### Source Data
[XNLI dataset](https://cims.nyu.edu/~sbowman/xnli/)
#### Initial Data Collection and Normalization
[Detailed in the paper](https://arxiv.org/abs/2204.08776)
#### Who are the source language producers?
[Detailed in the paper](https://arxiv.org/abs/2204.08776)
#### Human Verification Process
[Detailed in the paper](https://arxiv.org/abs/2204.08776)
## Considerations for Using the Data
### Social Impact of Dataset
[Detailed in the paper](https://arxiv.org/abs/2204.08776)
### Discussion of Biases
[Detailed in the paper](https://arxiv.org/abs/2204.08776)
### Other Known Limitations
[Detailed in the paper](https://arxiv.org/abs/2204.08776)
### Dataset Curators
Divyanshu Aggarwal, Vivek Gupta, Anoop Kunchukuttan
### Licensing Information
Contents of this repository are restricted to only non-commercial research purposes under the [Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/). Copyright of the dataset contents belongs to the original copyright holders.
### Citation Information
If you use any of the datasets, models or code modules, please cite the following paper:
```
@misc{https://doi.org/10.48550/arxiv.2204.08776,
doi = {10.48550/ARXIV.2204.08776},
url = {https://arxiv.org/abs/2204.08776},
author = {Aggarwal, Divyanshu and Gupta, Vivek and Kunchukuttan, Anoop},
keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {IndicXNLI: Evaluating Multilingual Inference for Indian Languages},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!-- ### Contributions -->
|
tay-yozhik/SyntheticTexts | ---
license: apache-2.0
language:
- ru
dataset_info:
features:
- name: text
dtype: string
---
# Dataset of synthetic texts
The dataset contains 2 million sentences that were automatically generated according to [rules written in a context-free grammar](https://github.com/GorbachevaTaisia/JSGF_generative_grammar).
The generated sentences are correct with respect to the morphological and syntactic rules of the Russian language; semantics was not taken into account.
In addition, they do not contain characters other than those of the Cyrillic alphabet.
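The generation mechanism can be sketched as follows (a toy grammar for illustration only; the actual JSGF grammar linked above encodes Russian morphology and syntax):

```python
import random

# Toy context-free grammar: each non-terminal maps to a list of
# alternative right-hand sides (sequences of terminals/non-terminals).
GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["кот"], ["собака"]],
    "VP": [["спит"], ["бежит"]],
}

def generate(symbol, rng):
    if symbol not in GRAMMAR:  # terminal symbol: emit as-is
        return symbol
    rhs = rng.choice(GRAMMAR[symbol])
    return " ".join(generate(s, rng) for s in rhs)

rng = random.Random(0)  # seeded for reproducibility
print(generate("S", rng))
```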
Sample sentences (original lack of punctuation and capitalization retained):
1. *либерализуем, закапывая между потеплениями*;
2. *не раньше календаря, обкидавши и хотелось бы изобрести, чтобы приручить поглубже*;
3. *а вам только стыдно идти фыркать военизированным*;
4. *почти притупляешь невинный сто первый успех, чуть не сбиваешь с толку шестиклассника из краснознаменской дидактики, сто десятые твои даже рекламируют и ляпнут ведь*;
5. *как будто отпиленные обычно становятся, если гофрируете*;
6. *вот я хвастаюсь*;
7. *едва поскрипнув справа от коллекции, чтобы лечь, решил убийственно счастливое создание видеообращения, спешащего почти с утечкой мини-футбола*;
8. *после разговора с врачами мы какие-то хорошие*;
9. *и под турином, драки и бездетность под поволжьем, распоясавшись, выяснили*;
10. *как будто тебя зовут не ты*.
|
ruliad/jsonl-expert-corpus | ---
dataset_info:
features:
- name: text
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 13313331192
num_examples: 5183886
download_size: 4153005422
dataset_size: 13313331192
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BNNT/IPQuiz | ---
license: cc-by-nc-sa-4.0
---
The IPQuiz dataset is used to assess a model's understanding of intellectual property-related concepts and regulations. IPQuiz is a multiple-choice question-response dataset collected from publicly available websites around the world in a variety of languages. For each question, the model needs to select an answer from a candidate list.
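Evaluation on such a multiple-choice format typically reduces to scoring each candidate answer and picking the best one; a minimal sketch (the function and field names are illustrative, not the actual IPQuiz schema):

```python
def pick_answer(candidates, scores):
    """Return the candidate the model assigns the highest score to."""
    best = max(range(len(candidates)), key=lambda i: scores[i])
    return candidates[best]

# e.g. per-candidate log-likelihoods from a model
print(pick_answer(["A", "B", "C", "D"], [-2.1, -0.4, -3.3, -1.9]))  # B
```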
## Sources
http://epaper.iprchn.com/zscqb/h5/html5/2023-04/21/content_27601_7600799.htm
https://eduai.baidu.com/view/6f7c32b0d7d8d15abe23482fb4daa58da0111cfe
https://www.juxuewen.com/question/1159.html
https://eduai.baidu.com/view/9c690b1d29f90242a8956bec0975f46527d3a7c9
https://wenku.baidu.com/view/daea1b562179168884868762caaedd3383c4b5d2.html?_wkts_=1683278279098
http://www.dxh.gov.cn/hdjl/zxdc/zxdcxq/index.shtml?/personalCenter/answerSheet/answerSheet.html?metadataId=ff80808180b79f8a01830b603e8001f7&siteId=43
http://guangxi.china.com.cn/2022-06/17/content_42002807.html
http://www.educhenggong.com/Uploads/attached/file/20210324/1616572571695330.pdf
https://www.gov.cn/guoqing/2021-10/29/content_5647633.htm
https://www.gov.cn/zhengce/2020-12/26/content_5573623.htm
https://www.gov.cn/gongbao/content/2000/content_60431.htm
https://www.gov.cn/zhengce/2020-12/26/content_5574514.htm
https://www.gov.cn/zhengce/2020-12/26/content_5573535.htm
https://www.gov.cn/zhengce/2020-12/26/content_5573535.htm
https://www.gov.cn/xinwen/2017-11/05/content_5237325.htm
https://www.gov.cn/zhengce/2020-12/26/content_5574414.htm
https://inside.nku.edu/content/dam/creative-thinking/docs/CT-Handouts-page/CT%20Handout%20Intellectual%20Property%20Quiz.pdf
https://www.wipo.int/ip-outreach/en/ipday/2017/ipday2017_quiz-copy.html
https://www.gastao.eu/ip-knowledge-test
https://www.proprofs.com/quiz-school/quizzes/intellectual-property-law-quiz
https://fr.surveymonkey.com/r/LW83BBV
https://www.riddle.com/view/57770?qzzr=1
https://about.lens.org/patent-knowledge-quiz/
https://www.examsegg.com/intellectual-property-rights-india-questions.html
https://www.rkdewan.com/quizstart.php?qid=24
https://www.eduki.ch/en/quizz/intellectual-property-quiz
https://qpkendra.com/mcq/ipr-and-patenting-mcq-pg-1.html
https://openstax.org/books/introduction-intellectual-property/pages/chapter-1
https://www.lexifiche.com/quiz-propriete-intellectuelle-breve
https://www.q-net.or.kr/cst003.do?id=cst00309&gSite=L&gId=51
https://www.agaroot.jp/benri/column/past-questions/
https://www.geo.de/wissen/quiz/wissenstest-erfindungen-i-30201276.html
https://www.geo.de/wissen/quiz/wissenstest-erfindungen-ii-30201270.html
https://www.eduki.ch/de/quizz/quiz-geistiges-eigentum
https://www.fsgu-akademie.de/quiz/geistiges-eigentum-teil-1/
https://www.fsgu-akademie.de/quiz/geistiges-eigentum-teil-2/
https://www.fsgu-akademie.de/quiz/geistiges-eigentum-teil-3/
https://www.fsgu-akademie.de/quiz/geistiges-eigentum-teil-4/
https://www.fsgu-akademie.de/quiz/geistiges-eigentum-teil-5/
https://www.fsgu-akademie.de/quiz/geistiges-eigentum-teil-6/
https://www.fsgu-akademie.de/quiz/geistiges-eigentum-teil-7/
https://www.fsgu-akademie.de/quiz/geistiges-eigentum-teil-8/
https://www.fsgu-akademie.de/quiz/geistiges-eigentum-teil-9/
https://www.fsgu-akademie.de/quiz/geistiges-eigentum-teil-10/
https://www.fsgu-akademie.de/quiz/design-schutzrecht-teil-1/
https://www.fsgu-akademie.de/quiz/design-schutzrecht-teil-2/
https://www.fsgu-akademie.de/quiz/design-schutzrecht-teil-3/
https://www.fsgu-akademie.de/quiz/design-schutzrecht-teil-4/
https://www.fsgu-akademie.de/quiz/handelsmarke-teil-1/
https://www.fsgu-akademie.de/quiz/handelsmarke-teil-2/
https://www.fsgu-akademie.de/quiz/handelsmarke-teil-3/
https://www.fsgu-akademie.de/quiz/handelsmarke-teil-4/
https://www.fsgu-akademie.de/quiz/patentrecht-deutschland-teil-1/
https://www.fsgu-akademie.de/quiz/patentrecht-deutschland-teil-2/
https://www.fsgu-akademie.de/quiz/patentrecht-deutschland-teil-3/
https://www.fsgu-akademie.de/quiz/patentrecht-deutschland-teil-4/
https://www.fsgu-akademie.de/quiz/urheberrecht-teil-1/
https://www.fsgu-akademie.de/quiz/urheberrecht-teil-2/
https://www.fsgu-akademie.de/quiz/urheberrecht-teil-3/
https://www.fsgu-akademie.de/quiz/urheberrecht-teil-4/ |
OBF/contest-passed-samples | ---
dataset_info:
- config_name: c
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 156639565
num_examples: 207238
download_size: 64950019
dataset_size: 156639565
- config_name: cpp
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 910672622
num_examples: 728924
download_size: 418773306
dataset_size: 910672622
- config_name: csharp
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 96598999
num_examples: 63292
download_size: 30177567
dataset_size: 96598999
- config_name: d
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 6087578
num_examples: 6957
download_size: 2457668
dataset_size: 6087578
- config_name: go
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 10932442
num_examples: 14534
download_size: 4467434
dataset_size: 10932442
- config_name: haskell
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 19868160
num_examples: 36709
download_size: 8822148
dataset_size: 19868160
- config_name: java
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 491585062
num_examples: 313196
download_size: 174444095
dataset_size: 491585062
- config_name: javascript
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 14174026
num_examples: 22962
download_size: 5799613
dataset_size: 14174026
- config_name: julia
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 1160100
num_examples: 3410
download_size: 503328
dataset_size: 1160100
- config_name: kotlin
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 7746876
num_examples: 11636
download_size: 2922306
dataset_size: 7746876
- config_name: php
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 7789714
num_examples: 16353
download_size: 3229847
dataset_size: 7789714
- config_name: python
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 576852508
num_examples: 1225624
download_size: 241813130
dataset_size: 576852508
- config_name: ruby
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 30450216
num_examples: 106124
download_size: 14020141
dataset_size: 30450216
- config_name: rust
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 15844013
num_examples: 11174
download_size: 5785746
dataset_size: 15844013
- config_name: scala
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 5193115
num_examples: 8487
download_size: 2053399
dataset_size: 5193115
- config_name: swift
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 756444
num_examples: 1739
download_size: 302746
dataset_size: 756444
- config_name: typescript
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 1029076
num_examples: 1671
download_size: 403980
dataset_size: 1029076
configs:
- config_name: c
data_files:
- split: train
path: c/train-*
- config_name: cpp
data_files:
- split: train
path: cpp/train-*
- config_name: csharp
data_files:
- split: train
path: csharp/train-*
- config_name: d
data_files:
- split: train
path: d/train-*
- config_name: go
data_files:
- split: train
path: go/train-*
- config_name: haskell
data_files:
- split: train
path: haskell/train-*
- config_name: java
data_files:
- split: train
path: java/train-*
- config_name: javascript
data_files:
- split: train
path: javascript/train-*
- config_name: julia
data_files:
- split: train
path: julia/train-*
- config_name: kotlin
data_files:
- split: train
path: kotlin/train-*
- config_name: php
data_files:
- split: train
path: php/train-*
- config_name: python
data_files:
- split: train
path: python/train-*
- config_name: ruby
data_files:
- split: train
path: ruby/train-*
- config_name: rust
data_files:
- split: train
path: rust/train-*
- config_name: scala
data_files:
- split: train
path: scala/train-*
- config_name: swift
data_files:
- split: train
path: swift/train-*
- config_name: typescript
data_files:
- split: train
path: typescript/train-*
---
|
open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.6e-13b | ---
pretty_name: Evaluation run of uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b](https://huggingface.co/uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.6e-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-10-29T12:34:15.259983](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.6e-13b/blob/main/results_2023-10-29T12-34-15.259983.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3847525167785235,\n\
\ \"em_stderr\": 0.004982591799399597,\n \"f1\": 0.4459479865771823,\n\
\ \"f1_stderr\": 0.004798591730535464,\n \"acc\": 0.40086514083221786,\n\
\ \"acc_stderr\": 0.00878469848110711\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3847525167785235,\n \"em_stderr\": 0.004982591799399597,\n\
\ \"f1\": 0.4459479865771823,\n \"f1_stderr\": 0.004798591730535464\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04245640636846096,\n \
\ \"acc_stderr\": 0.005553837749990046\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224174\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|arc:challenge|25_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|arc:challenge|25_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T12_34_15.259983
path:
- '**/details_harness|drop|3_2023-10-29T12-34-15.259983.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T12-34-15.259983.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T12_34_15.259983
path:
- '**/details_harness|gsm8k|5_2023-10-29T12-34-15.259983.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T12-34-15.259983.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hellaswag|10_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hellaswag|10_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T12_34_15.259983
path:
- '**/details_harness|winogrande|5_2023-10-29T12-34-15.259983.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T12-34-15.259983.parquet'
- config_name: results
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- results_2023-08-31T13:16:07.085332.parquet
- split: 2023_08_31T13_45_32.435027
path:
- results_2023-08-31T13:45:32.435027.parquet
- split: 2023_10_29T12_34_15.259983
path:
- results_2023-10-29T12-34-15.259983.parquet
- split: latest
path:
- results_2023-10-29T12-34-15.259983.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b](https://huggingface.co/uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
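As the configuration list above shows, each timestamped split name is the run timestamp with the `-` and `:` characters replaced by `_` (the parquet filenames keep the original timestamp). A minimal sketch of that mapping, as a hypothetical helper inferred from the split names in this card:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name (inferred convention).

    Split names replace "-" and ":" with "_" while keeping the
    fractional-seconds dot, e.g.
    "2023-08-31T13:16:07.085332" -> "2023_08_31T13_16_07.085332".
    """
    return ts.replace("-", "_").replace(":", "_")
```

This is only an observation about the naming convention visible in this card's YAML, not an official API of the leaderboard tooling.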
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.6e-13b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-29T12:34:15.259983](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.6e-13b/blob/main/results_2023-10-29T12-34-15.259983.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.3847525167785235,
"em_stderr": 0.004982591799399597,
"f1": 0.4459479865771823,
"f1_stderr": 0.004798591730535464,
"acc": 0.40086514083221786,
"acc_stderr": 0.00878469848110711
},
"harness|drop|3": {
"em": 0.3847525167785235,
"em_stderr": 0.004982591799399597,
"f1": 0.4459479865771823,
"f1_stderr": 0.004798591730535464
},
"harness|gsm8k|5": {
"acc": 0.04245640636846096,
"acc_stderr": 0.005553837749990046
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224174
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Ransaka/aya_sinhala_subset | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: annotation_type
dtype: string
- name: user_id
dtype: string
splits:
- name: train
num_bytes: 17985681.039038565
num_examples: 14524
download_size: 7173921
dataset_size: 17985681.039038565
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jdowni80/babugroup_llamology_embeddings | ---
dataset_info:
features:
- name: title
dtype: string
- name: page
dtype: float64
- name: content
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 918062
num_examples: 1150
download_size: 403162
dataset_size: 918062
---
# Dataset Card for "babugroup_llamology_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
naem1023/augmented-concat-100000 | ---
license: apache-2.0
---
|
Umal-exvc/chocolate-captioned-dataset-200 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 39998461.0
num_examples: 200
download_size: 39150206
dataset_size: 39998461.0
---
# Dataset Card for "chocolate-captioned-dataset-200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BEE-spoke-data__smol_llama-101M-GQA | ---
pretty_name: Evaluation run of BEE-spoke-data/smol_llama-101M-GQA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BEE-spoke-data/smol_llama-101M-GQA](https://huggingface.co/BEE-spoke-data/smol_llama-101M-GQA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BEE-spoke-data__smol_llama-101M-GQA_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-18T22:28:51.599216](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__smol_llama-101M-GQA_public/blob/main/results_2023-11-18T22-28-51.599216.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.243457145392589,\n\
\ \"acc_stderr\": 0.030232451207481324,\n \"acc_norm\": 0.2440460125546807,\n\
\ \"acc_norm_stderr\": 0.03099855104029764,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652069,\n \"mc2\": 0.45801037294841895,\n\
\ \"mc2_stderr\": 0.01513659314586415,\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.0004191330178826889,\n \"f1\": 0.03420931208053696,\n\
\ \"f1_stderr\": 0.0011030675027452802\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.18088737201365188,\n \"acc_stderr\": 0.011248574467407024,\n\
\ \"acc_norm\": 0.23464163822525597,\n \"acc_norm_stderr\": 0.012383873560768675\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2789285002987453,\n\
\ \"acc_stderr\": 0.004475557360359701,\n \"acc_norm\": 0.287293367855009,\n\
\ \"acc_norm_stderr\": 0.004515748192605715\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n\
\ \"acc_stderr\": 0.03853254836552004,\n \"acc_norm\": 0.2740740740740741,\n\
\ \"acc_norm_stderr\": 0.03853254836552004\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.03279000406310052,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.03279000406310052\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n\
\ \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.19,\n \
\ \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.025757559893106737,\n\
\ \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.025757559893106737\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.035887028128263714,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.035887028128263714\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\"\
: 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1907514450867052,\n\
\ \"acc_stderr\": 0.029957851329869337,\n \"acc_norm\": 0.1907514450867052,\n\
\ \"acc_norm_stderr\": 0.029957851329869337\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23829787234042554,\n \"acc_stderr\": 0.02785125297388977,\n\
\ \"acc_norm\": 0.23829787234042554,\n \"acc_norm_stderr\": 0.02785125297388977\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.17989417989417988,\n \"acc_stderr\": 0.019782119832766426,\n \"\
acc_norm\": 0.17989417989417988,\n \"acc_norm_stderr\": 0.019782119832766426\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3064516129032258,\n \"acc_stderr\": 0.026226485652553873,\n \"\
acc_norm\": 0.3064516129032258,\n \"acc_norm_stderr\": 0.026226485652553873\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n \"\
acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\"\
: 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.26262626262626265,\n \"acc_stderr\": 0.031353050095330855,\n \"\
acc_norm\": 0.26262626262626265,\n \"acc_norm_stderr\": 0.031353050095330855\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.34196891191709844,\n \"acc_stderr\": 0.03423465100104281,\n\
\ \"acc_norm\": 0.34196891191709844,\n \"acc_norm_stderr\": 0.03423465100104281\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.023290888053772725,\n\
\ \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.023290888053772725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.02921354941437216,\n \
\ \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.02921354941437216\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22752293577981653,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.22752293577981653,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696043,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501943,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501943\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21973094170403587,\n\
\ \"acc_stderr\": 0.027790177064383605,\n \"acc_norm\": 0.21973094170403587,\n\
\ \"acc_norm_stderr\": 0.027790177064383605\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2892561983471074,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.03834241021419073,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.03834241021419073\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545546,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545546\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888135,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888135\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.025755865922632924,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.025755865922632924\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.02378858355165855,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02378858355165855\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859063,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859063\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.01099615663514269,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.01099615663514269\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.027257202606114944,\n\
\ \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.027257202606114944\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.0178831881346672,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.0178831881346672\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.33877551020408164,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.33877551020408164,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n\
\ \"acc_stderr\": 0.028996909693328923,\n \"acc_norm\": 0.21393034825870647,\n\
\ \"acc_norm_stderr\": 0.028996909693328923\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.0317555478662992,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.0317555478662992\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652069,\n \"mc2\": 0.45801037294841895,\n\
\ \"mc2_stderr\": 0.01513659314586415\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5067087608524072,\n \"acc_stderr\": 0.014051220692330352\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \
\ \"em_stderr\": 0.0004191330178826889,\n \"f1\": 0.03420931208053696,\n\
\ \"f1_stderr\": 0.0011030675027452802\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0075815011372251705,\n \"acc_stderr\": 0.0023892815120772244\n\
\ }\n}\n```"
repo_url: https://huggingface.co/BEE-spoke-data/smol_llama-101M-GQA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|arc:challenge|25_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|arc:challenge|25_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|drop|3_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|drop|3_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|gsm8k|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|gsm8k|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hellaswag|10_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hellaswag|10_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T14-04-20.381972.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T22-28-51.599216.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-18T22-28-51.599216.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- '**/details_harness|winogrande|5_2023-11-18T14-04-20.381972.parquet'
- split: 2023_11_18T22_28_51.599216
path:
- '**/details_harness|winogrande|5_2023-11-18T22-28-51.599216.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-18T22-28-51.599216.parquet'
- config_name: results
data_files:
- split: 2023_11_18T14_04_20.381972
path:
- results_2023-11-18T14-04-20.381972.parquet
- split: 2023_11_18T22_28_51.599216
path:
- results_2023-11-18T22-28-51.599216.parquet
- split: latest
path:
- results_2023-11-18T22-28-51.599216.parquet
---
# Dataset Card for Evaluation run of BEE-spoke-data/smol_llama-101M-GQA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/BEE-spoke-data/smol_llama-101M-GQA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [BEE-spoke-data/smol_llama-101M-GQA](https://huggingface.co/BEE-spoke-data/smol_llama-101M-GQA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__smol_llama-101M-GQA_public",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-11-18T22:28:51.599216](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__smol_llama-101M-GQA_public/blob/main/results_2023-11-18T22-28-51.599216.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and "latest" splits for each eval):
```json
{
"all": {
"acc": 0.243457145392589,
"acc_stderr": 0.030232451207481324,
"acc_norm": 0.2440460125546807,
"acc_norm_stderr": 0.03099855104029764,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652069,
"mc2": 0.45801037294841895,
"mc2_stderr": 0.01513659314586415,
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826889,
"f1": 0.03420931208053696,
"f1_stderr": 0.0011030675027452802
},
"harness|arc:challenge|25": {
"acc": 0.18088737201365188,
"acc_stderr": 0.011248574467407024,
"acc_norm": 0.23464163822525597,
"acc_norm_stderr": 0.012383873560768675
},
"harness|hellaswag|10": {
"acc": 0.2789285002987453,
"acc_stderr": 0.004475557360359701,
"acc_norm": 0.287293367855009,
"acc_norm_stderr": 0.004515748192605715
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552004,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552004
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.03279000406310052,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.03279000406310052
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.025757559893106737,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.025757559893106737
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1907514450867052,
"acc_stderr": 0.029957851329869337,
"acc_norm": 0.1907514450867052,
"acc_norm_stderr": 0.029957851329869337
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23829787234042554,
"acc_stderr": 0.02785125297388977,
"acc_norm": 0.23829787234042554,
"acc_norm_stderr": 0.02785125297388977
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0404933929774814,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0404933929774814
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.17989417989417988,
"acc_stderr": 0.019782119832766426,
"acc_norm": 0.17989417989417988,
"acc_norm_stderr": 0.019782119832766426
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3064516129032258,
"acc_stderr": 0.026226485652553873,
"acc_norm": 0.3064516129032258,
"acc_norm_stderr": 0.026226485652553873
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.0316185633535861,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.0316185633535861
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.26262626262626265,
"acc_stderr": 0.031353050095330855,
"acc_norm": 0.26262626262626265,
"acc_norm_stderr": 0.031353050095330855
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.34196891191709844,
"acc_stderr": 0.03423465100104281,
"acc_norm": 0.34196891191709844,
"acc_norm_stderr": 0.03423465100104281
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30256410256410254,
"acc_stderr": 0.023290888053772725,
"acc_norm": 0.30256410256410254,
"acc_norm_stderr": 0.023290888053772725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2815126050420168,
"acc_stderr": 0.02921354941437216,
"acc_norm": 0.2815126050420168,
"acc_norm_stderr": 0.02921354941437216
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22752293577981653,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.22752293577981653,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501943,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501943
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21973094170403587,
"acc_stderr": 0.027790177064383605,
"acc_norm": 0.21973094170403587,
"acc_norm_stderr": 0.027790177064383605
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2892561983471074,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.2892561983471074,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.03834241021419073,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.03834241021419073
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888135,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888135
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632924,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632924
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02378858355165855,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02378858355165855
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.02646903681859063,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.02646903681859063
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.01099615663514269,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.01099615663514269
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.027257202606114944,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.027257202606114944
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.0178831881346672,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.0178831881346672
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.33877551020408164,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.33877551020408164,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.028996909693328923,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.028996909693328923
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.0317555478662992,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.0317555478662992
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652069,
"mc2": 0.45801037294841895,
"mc2_stderr": 0.01513659314586415
},
"harness|winogrande|5": {
"acc": 0.5067087608524072,
"acc_stderr": 0.014051220692330352
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826889,
"f1": 0.03420931208053696,
"f1_stderr": 0.0011030675027452802
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.0023892815120772244
}
}
```
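For quick local inspection, a results JSON like the one above can be summarized without the `datasets` library. The sketch below uses a small illustrative subset of the dict (not the full results file) and assumes a simple unweighted mean over the MMLU (hendrycksTest) tasks:

```python
# Illustrative subset of the results dict shown above (not the full file).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2740740740740741},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.20394736842105263},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu = {k: v["acc"] for k, v in results.items() if "hendrycksTest" in k}
mean_acc = sum(mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU tasks, mean acc = {mean_acc:.4f}")
```

The same pattern extends to `acc_norm` or any other metric key present in the per-task entries.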
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
LahiruLowe/cot_explanation_targets_mosaicml-mpt-7b-8k-instruct | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
- name: explained_targets
dtype: string
splits:
- name: train
num_bytes: 15709
num_examples: 35
download_size: 13037
dataset_size: 15709
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cot_explanation_targets_mosaicml-mpt-7b-8k-instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KnutJaegersberg/WizardLM_evol_instruct_V2_196k_instruct_format | ---
license: cc-by-nc-4.0
---
Moralization detection is heuristic and keyword-based, using the usual keyword list and limited to roughly the first 400 characters so as not to throw away a lot of false positives. |
aisyahhrazak/crawl-fiksyenshasha | ---
language:
- ms
---
About
- Data scraped from https://fiksyenshasha.com/ |
AdapterOcean/gorilla_16k_standardized_cluster_3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 30046179
num_examples: 3326
download_size: 7772997
dataset_size: 30046179
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gorilla_16k_standardized_cluster_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Wrathfulreap/Wrath001 | ---
license: apache-2.0
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
### Languages
English
## Dataset Structure
categorized
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
08/18/2023
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
no illegal content
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Apache 2.0
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
GEM-submissions/GEM__bart_base_schema_guided_dialog__1645547915 | ---
benchmark: gem
type: prediction
submission_name: BART_BASE_schema_guided_dialog
---
|
open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2 | ---
pretty_name: Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2](https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T17:04:25.517599](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2/blob/main/results_2024-01-23T17-04-25.517599.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6490724115520589,\n\
\ \"acc_stderr\": 0.032259594360133925,\n \"acc_norm\": 0.648421745854093,\n\
\ \"acc_norm_stderr\": 0.03293377935819422,\n \"mc1\": 0.602203182374541,\n\
\ \"mc1_stderr\": 0.01713393424855968,\n \"mc2\": 0.7273114161495677,\n\
\ \"mc2_stderr\": 0.014814015917833025\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7192832764505119,\n \"acc_stderr\": 0.01313123812697558,\n\
\ \"acc_norm\": 0.7406143344709898,\n \"acc_norm_stderr\": 0.012808273573927106\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7392949611631149,\n\
\ \"acc_stderr\": 0.0043812204096411725,\n \"acc_norm\": 0.8924517028480382,\n\
\ \"acc_norm_stderr\": 0.003091759094519539\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337135,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337135\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.038498560987940904,\n \"\
acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940904\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993459,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993459\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n\
\ \"acc_stderr\": 0.016588680864530626,\n \"acc_norm\": 0.43687150837988825,\n\
\ \"acc_norm_stderr\": 0.016588680864530626\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.01274197433389723,\n\
\ \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.01274197433389723\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"\
acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623553,\n \
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.602203182374541,\n\
\ \"mc1_stderr\": 0.01713393424855968,\n \"mc2\": 0.7273114161495677,\n\
\ \"mc2_stderr\": 0.014814015917833025\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272982\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6641394996209249,\n \
\ \"acc_stderr\": 0.013009224714267362\n }\n}\n```"
repo_url: https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|arc:challenge|25_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|gsm8k|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hellaswag|10_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T17-04-25.517599.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T17-04-25.517599.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- '**/details_harness|winogrande|5_2024-01-23T17-04-25.517599.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T17-04-25.517599.parquet'
- config_name: results
data_files:
- split: 2024_01_23T17_04_25.517599
path:
- results_2024-01-23T17-04-25.517599.parquet
- split: latest
path:
- results_2024-01-23T17-04-25.517599.parquet
---
# Dataset Card for Evaluation run of YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2](https://huggingface.co/YouKnowMee/Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2",
"harness_winogrande_5",
	split="latest")
```
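Note that the per-run splits are named after the run timestamp, with characters that are invalid in split names replaced by underscores. A minimal sketch of that naming convention (the helper below is illustrative, not part of the `datasets` library):

```python
# Illustrative helper: map an ISO-like run timestamp to the split name
# used in this repository (":" and "-" both become "_").
def run_timestamp_to_split(ts: str) -> str:
    return ts.replace(":", "_").replace("-", "_")

print(run_timestamp_to_split("2024-01-23T17:04:25.517599"))
# -> 2024_01_23T17_04_25.517599
```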
## Latest results
These are the [latest results from run 2024-01-23T17:04:25.517599](https://huggingface.co/datasets/open-llm-leaderboard/details_YouKnowMee__Mistral-7b-instruct-v0.2-summ-sft-dpo-ed2/blob/main/results_2024-01-23T17-04-25.517599.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6490724115520589,
"acc_stderr": 0.032259594360133925,
"acc_norm": 0.648421745854093,
"acc_norm_stderr": 0.03293377935819422,
"mc1": 0.602203182374541,
"mc1_stderr": 0.01713393424855968,
"mc2": 0.7273114161495677,
"mc2_stderr": 0.014814015917833025
},
"harness|arc:challenge|25": {
"acc": 0.7192832764505119,
"acc_stderr": 0.01313123812697558,
"acc_norm": 0.7406143344709898,
"acc_norm_stderr": 0.012808273573927106
},
"harness|hellaswag|10": {
"acc": 0.7392949611631149,
"acc_stderr": 0.0043812204096411725,
"acc_norm": 0.8924517028480382,
"acc_norm_stderr": 0.003091759094519539
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337135,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337135
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940904,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940904
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993459,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993459
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.016588680864530626,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.016588680864530626
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.01274197433389723,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.01274197433389723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623553,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.602203182374541,
"mc1_stderr": 0.01713393424855968,
"mc2": 0.7273114161495677,
"mc2_stderr": 0.014814015917833025
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272982
},
"harness|gsm8k|5": {
"acc": 0.6641394996209249,
"acc_stderr": 0.013009224714267362
}
}
```
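The top-level `"all"` entry aggregates the per-task scores. As an illustrative sketch of how such a macro-average can be computed from a results dictionary shaped like the JSON above (the `mean_acc` helper below is hypothetical, not part of the evaluation harness):

```python
# Hypothetical helper: macro-average the "acc" metric over per-task results,
# skipping the pre-aggregated "all" entry and tasks without an "acc" field.
def mean_acc(results: dict) -> float:
    accs = [metrics["acc"] for task, metrics in results.items()
            if task != "all" and "acc" in metrics]
    return sum(accs) / len(accs)

# Small made-up subset, just to show the expected shape of the input.
sample = {
    "all": {"acc": 0.55},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|hendrycksTest-virology|5": {"acc": 0.5},
}
print(mean_acc(sample))  # -> 0.55
```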
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AdapterOcean/GPTeacher_roleplay_standardized_cluster_2_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 477385
num_examples: 1614
download_size: 256809
dataset_size: 477385
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "GPTeacher_roleplay_standardized_cluster_2_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RUXHIR2828/DATASET | ---
license: openrail
---
|
pytc/EM30 | ---
license: mit
task_categories:
- image-segmentation
language:
- en
pretty_name: em30
size_categories:
- 1B<n<10B
--- |
TigerResearch/tigerbot-wiki-plugin | ---
license: apache-2.0
language:
- en
---
Raw external-knowledge ("external brain") data used by the [Tigerbot](https://github.com/TigerResearch/TigerBot) model during its rethink step; encyclopedia category.
## Usage
```python
import datasets
ds_sft = datasets.load_dataset('TigerResearch/tigerbot-wiki-plugin')
``` |
smartdata | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- de
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: SmartData
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-DATE
'2': I-DATE
'3': B-DISASTER_TYPE
'4': I-DISASTER_TYPE
'5': B-DISTANCE
'6': I-DISTANCE
'7': B-DURATION
'8': I-DURATION
'9': B-LOCATION
'10': I-LOCATION
'11': B-LOCATION_CITY
'12': I-LOCATION_CITY
'13': B-LOCATION_ROUTE
'14': I-LOCATION_ROUTE
'15': B-LOCATION_STOP
'16': I-LOCATION_STOP
'17': B-LOCATION_STREET
'18': I-LOCATION_STREET
'19': B-NUMBER
'20': I-NUMBER
'21': B-ORGANIZATION
'22': I-ORGANIZATION
'23': B-ORGANIZATION_COMPANY
'24': I-ORGANIZATION_COMPANY
'25': B-ORG_POSITION
'26': I-ORG_POSITION
'27': B-PERSON
'28': I-PERSON
'29': B-TIME
'30': I-TIME
'31': B-TRIGGER
'32': I-TRIGGER
config_name: smartdata-v3_20200302
splits:
- name: train
num_bytes: 2124312
num_examples: 1861
- name: test
num_bytes: 266529
num_examples: 230
- name: validation
num_bytes: 258681
num_examples: 228
download_size: 18880782
dataset_size: 2649522
---
# Dataset Card for SmartData
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.dfki.de/web/forschung/projekte-publikationen/publikationen-uebersicht/publikation/9427/
- **Repository:** https://github.com/DFKI-NLP/smartdata-corpus
- **Paper:** https://www.dfki.de/fileadmin/user_upload/import/9427_lrec_smartdata_corpus.pdf
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
DFKI SmartData Corpus is a dataset of 2598 German-language documents
which has been annotated with fine-grained geo-entities, such as streets,
stops and routes, as well as standard named entity types. It has also
been annotated with a set of 15 traffic- and industry-related n-ary
relations and events, such as Accidents, Traffic jams, Acquisitions,
and Strikes. The corpus consists of newswire texts, Twitter messages,
and traffic reports from radio stations, police and railway companies.
It allows for training and evaluating both named entity recognition
algorithms that aim for fine-grained typing of geo-entities, as well
as n-ary relation extraction systems.
### Supported Tasks and Leaderboards
NER
### Languages
German
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
- id: an identifier for the article the text came from
- tokens: a list of string tokens for the text of the article
- ner_tags: a corresponding list of NER tags in the BIO format
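As a sketch of how the integer `ner_tags` decode back to the label names declared in the YAML header above (the toy sentence below is invented; when loading with `datasets`, the same list is available as `dataset.features["ner_tags"].feature.names`):

```python
# Decode BIO tag ids into (token, label) pairs.
# The label list below mirrors the class_label names in the card's YAML header.
LABELS = [
    "O", "B-DATE", "I-DATE", "B-DISASTER_TYPE", "I-DISASTER_TYPE",
    "B-DISTANCE", "I-DISTANCE", "B-DURATION", "I-DURATION",
    "B-LOCATION", "I-LOCATION", "B-LOCATION_CITY", "I-LOCATION_CITY",
    "B-LOCATION_ROUTE", "I-LOCATION_ROUTE", "B-LOCATION_STOP", "I-LOCATION_STOP",
    "B-LOCATION_STREET", "I-LOCATION_STREET", "B-NUMBER", "I-NUMBER",
    "B-ORGANIZATION", "I-ORGANIZATION", "B-ORGANIZATION_COMPANY", "I-ORGANIZATION_COMPANY",
    "B-ORG_POSITION", "I-ORG_POSITION", "B-PERSON", "I-PERSON",
    "B-TIME", "I-TIME", "B-TRIGGER", "I-TRIGGER",
]

def decode_tags(tokens, tag_ids, labels):
    """Pair each token with its human-readable BIO label."""
    return [(tok, labels[tag]) for tok, tag in zip(tokens, tag_ids)]

# Invented toy row standing in for one corpus example.
tokens = ["Unfall", "auf", "der", "A7"]
tag_ids = [0, 0, 0, 17]
print(decode_tags(tokens, tag_ids, LABELS))
# -> [('Unfall', 'O'), ('auf', 'O'), ('der', 'O'), ('A7', 'B-LOCATION_STREET')]
```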
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
CC-BY 4.0
### Citation Information
```
@InProceedings{SCHIERSCH18.85,
author = {Martin Schiersch and Veselina Mironova and Maximilian Schmitt and Philippe Thomas and Aleksandra Gabryszak and Leonhard Hennig},
title = "{A German Corpus for Fine-Grained Named Entity Recognition and Relation Extraction of Traffic and Industry Events}",
booktitle = {Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)},
year = {2018},
month = {May 7-12, 2018},
address = {Miyazaki, Japan},
editor = {Nicoletta Calzolari (Conference chair) and Khalid Choukri and Christopher Cieri and Thierry Declerck and Sara Goggi and Koiti Hasida and Hitoshi Isahara and Bente Maegaard and Joseph Mariani and Hélène Mazo and Asuncion Moreno and Jan Odijk and Stelios Piperidis and Takenobu Tokunaga},
publisher = {European Language Resources Association (ELRA)},
isbn = {979-10-95546-00-9},
language = {english}
}
```
### Contributions
Thanks to [@aseifert](https://github.com/aseifert) for adding this dataset. |
nuprl/manytypes4py | ---
dataset_info:
features:
- name: zip
dtype: string
- name: filename
dtype: string
- name: contents
dtype: string
- name: type_annotations
sequence: string
- name: type_annotation_starts
sequence: int64
- name: type_annotation_ends
sequence: int64
splits:
- name: train
num_bytes: 4206116750
num_examples: 548536
download_size: 1334224020
dataset_size: 4206116750
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: openrail
pretty_name: ManyTypes4Py Reconstruction
---
# ManyTypes4Py-Reconstructed
This is a reconstruction of the original code from the [ManyTypes4Py paper]:
A. M. Mir, E. Latoškinas and G. Gousios, "ManyTypes4Py: A Benchmark Python
Dataset for Machine Learning-based Type Inference," *IEEE/ACM International
Conference on Mining Software Repositories (MSR)*, 2021, pp. 585-589
[The artifact] (v0.7) for ManyTypes4Py does not have the original Python files.
Instead, each file is pre-processed into a stream of types without comments,
and the contents of each repository are stored in a single JSON file.
This reconstructed dataset has raw Python code.
More specifically:
1. We extract the list of repositories from the "clean" subset of ManyTypes4Py,
which are the repositories that type-check with *mypy*.
2. We attempt to download all repositories, but only succeed in fetching
4,663 (out of ~5.2K).
3. We augment each file with the text of each type annotation, as well as their
start and end positions (in bytes) in the code.
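The byte offsets from step 3 apply to the UTF-8 encoding of `contents`, not to the Python `str` directly; a minimal sketch with an invented sample row (the field names match the YAML schema above):

```python
# Recover each annotation's text from its byte offsets. Positions are in bytes,
# so slice the UTF-8 encoding of the file contents rather than the str itself.
def annotation_spans(contents: str, starts, ends):
    data = contents.encode("utf-8")
    return [data[s:e].decode("utf-8") for s, e in zip(starts, ends)]

# Invented sample row standing in for one dataset record.
row = {
    "contents": "def add(x: int, y: int) -> int:\n    return x + y\n",
    "type_annotation_starts": [11, 19, 27],
    "type_annotation_ends": [14, 22, 30],
}
print(annotation_spans(row["contents"],
                       row["type_annotation_starts"],
                       row["type_annotation_ends"]))
# -> ['int', 'int', 'int']
```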
## Internal Note
The dataset construction code is on the Discovery cluster at `/work/arjunguha-research-group/arjun/projects/ManyTypesForPy_reconstruction`.
[ManyTypes4Py paper]: https://arxiv.org/abs/2104.04706
[The artifact]: https://zenodo.org/records/4719447 |
homersimpson/opensubtitles_it | ---
dataset_info:
features:
- name: id
dtype: string
- name: meta
struct:
- name: year
dtype: uint32
- name: imdbId
dtype: uint32
- name: subtitleId
struct:
- name: ca
dtype: uint32
- name: it
dtype: uint32
- name: sentenceIds
struct:
- name: ca
sequence: uint32
- name: it
sequence: uint32
- name: translation
dtype:
translation:
languages:
- ca
- it
splits:
- name: train
num_bytes: 28923297.6
num_examples: 240000
- name: validation
num_bytes: 3615412.2
num_examples: 30000
- name: test
num_bytes: 3615412.2
num_examples: 30000
download_size: 26056578
dataset_size: 36154122.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
loubnabnl/multiple-preds-new | ---
dataset_info:
features:
- name: predictions
sequence: string
splits:
- name: train
num_bytes: 21710012
num_examples: 161
download_size: 3267344
dataset_size: 21710012
---
# Dataset Card for "multiple-preds-new"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-636a44ed-fa98-4717-b181-b742a86b03be-4846 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: autoevaluate/multi-class-classification
metrics: ['matthews_correlation']
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: autoevaluate/multi-class-classification
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
jecht/cognitive_bias | ---
license: cc-by-2.0
---
|
Dizex/FoodBase | ---
dataset_info:
features:
- name: nltk_tokens
sequence: string
- name: iob_tags
sequence: string
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2040036
num_examples: 600
- name: val
num_bytes: 662190
num_examples: 200
download_size: 353747
dataset_size: 2702226
---
# Dataset Card for "FoodBase"
Dataset for the FoodBase corpus, introduced in [this paper](https://academic.oup.com/database/article/doi/10.1093/database/baz121/5611291).
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ebullioscopic/Raw-Web-Scraped-to-JSON | ---
license: apache-2.0
---
|
kopyl/sample-laion-dataset-different-ratio | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 9092337.0
num_examples: 100
download_size: 9043719
dataset_size: 9092337.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AiBototicus/autotrain-data-colors-1 | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: colors-1
## Dataset Description
This dataset has been automatically processed by AutoTrain for project colors-1.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<1920x1080 P PIL image>",
"target": 1
},
{
"image": "<1000x563 RGB PIL image>",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['blue', 'green', 'red'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 10 |
| valid | 3 |
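The integer `target` maps back to a color via the `ClassLabel` name order shown above; a minimal sketch (no AutoTrain-specific API assumed):

```python
# Decode integer targets using the ClassLabel name order from the field spec.
COLOR_NAMES = ["blue", "green", "red"]

def target_name(target: int) -> str:
    """Map a ClassLabel integer back to its color name."""
    return COLOR_NAMES[target]

# Both samples in the JSON excerpt above carry target 1:
print(target_name(1))  # -> green
```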
|
CVasNLPExperiments/FGVC_Aircraft_test_google_flan_t5_xl_mode_T_A_ns_3333 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_with_openai_classes_Attributes_ViT_L_14_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 1925739
num_examples: 3333
download_size: 337848
dataset_size: 1925739
---
# Dataset Card for "FGVC_Aircraft_test_google_flan_t5_xl_mode_T_A_ns_3333"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
polytechXhf/jojos-dataset | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license:
- cc-by-3.0
multilinguality:
- monolingual
pretty_name: 'Dataset for Jojo’s x One Piece Fine Tune Stable Diffusion (FSTD) '
size_categories:
- n<1K
source_datasets:
- original
tags:
- jojos-bizarre-adventures
- one-piece
- fine-tune-stable-diffusion
task_categories:
- text-to-image
task_ids: []
dataset_info:
features:
- name: image
dtype: image
- name: char_name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1388226.0
num_examples: 8
download_size: 1389896
dataset_size: 1388226.0
---
# More details will be added |
lerobot/xarm_lift_medium | ---
dataset_info:
features:
- name: observation.image
sequence:
sequence:
sequence: uint8
- name: observation.state
sequence: float32
- name: action
sequence: float32
- name: episode_id
dtype: int32
- name: frame_id
dtype: int64
- name: timestamp
dtype: float32
- name: next.reward
dtype: float32
- name: next.done
dtype: bool
- name: index
dtype: int64
- name: episode_data_id_from
dtype: int64
- name: episode_data_id_to
dtype: int64
splits:
- name: train
num_bytes: 445522500
num_examples: 20000
download_size: 66451301
dataset_size: 445522500
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Back-up/qa-no-answer | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: is_impossible
dtype: bool
splits:
- name: train
num_bytes: 11754158.929191412
num_examples: 9217
download_size: 2677376
dataset_size: 11754158.929191412
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "qa-no-answer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jzjiao__opt-1.3b-rlhf | ---
pretty_name: Evaluation run of jzjiao/opt-1.3b-rlhf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jzjiao/opt-1.3b-rlhf](https://huggingface.co/jzjiao/opt-1.3b-rlhf) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jzjiao__opt-1.3b-rlhf\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T18:51:18.497076](https://huggingface.co/datasets/open-llm-leaderboard/details_jzjiao__opt-1.3b-rlhf/blob/main/results_2023-12-03T18-51-18.497076.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.004548900682335102,\n\
\ \"acc_stderr\": 0.0018535550440036204\n },\n \"harness|gsm8k|5\"\
: {\n \"acc\": 0.004548900682335102,\n \"acc_stderr\": 0.0018535550440036204\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jzjiao/opt-1.3b-rlhf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T11_35_35.582368
path:
- '**/details_harness|drop|3_2023-10-19T11-35-35.582368.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T11-35-35.582368.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T11_35_35.582368
path:
- '**/details_harness|gsm8k|5_2023-10-19T11-35-35.582368.parquet'
- split: 2023_12_03T18_51_18.497076
path:
- '**/details_harness|gsm8k|5_2023-12-03T18-51-18.497076.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T18-51-18.497076.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:36:48.435460.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:36:48.435460.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:36:48.435460.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T11_35_35.582368
path:
- '**/details_harness|winogrande|5_2023-10-19T11-35-35.582368.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T11-35-35.582368.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_36_48.435460
path:
- results_2023-07-19T14:36:48.435460.parquet
- split: 2023_10_19T11_35_35.582368
path:
- results_2023-10-19T11-35-35.582368.parquet
- split: 2023_12_03T18_51_18.497076
path:
- results_2023-12-03T18-51-18.497076.parquet
- split: latest
path:
- results_2023-12-03T18-51-18.497076.parquet
---
# Dataset Card for Evaluation run of jzjiao/opt-1.3b-rlhf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jzjiao/opt-1.3b-rlhf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jzjiao/opt-1.3b-rlhf](https://huggingface.co/jzjiao/opt-1.3b-rlhf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jzjiao__opt-1.3b-rlhf",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-03T18:51:18.497076](https://huggingface.co/datasets/open-llm-leaderboard/details_jzjiao__opt-1.3b-rlhf/blob/main/results_2023-12-03T18-51-18.497076.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in its own results split, with the "latest" split pointing to the most recent eval):
```python
{
"all": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036204
},
"harness|gsm8k|5": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036204
}
}
```
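If you only need the headline numbers, the aggregated results dictionary above can be flattened into per-task metrics. A minimal sketch (the dictionary below is a hardcoded copy of the results shown above; real runs may contain more `harness|task|n_shot` keys):

```python
# Hardcoded copy of the aggregated results shown above.
results = {
    "all": {
        "acc": 0.004548900682335102,
        "acc_stderr": 0.0018535550440036204,
    },
    "harness|gsm8k|5": {
        "acc": 0.004548900682335102,
        "acc_stderr": 0.0018535550440036204,
    },
}

# Keep only the per-task entries, dropping the "all" aggregate.
per_task_acc = {task: m["acc"] for task, m in results.items() if task != "all"}
print(per_task_acc)  # {'harness|gsm8k|5': 0.004548900682335102}
```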
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Maeda-miyazaki/dataset_information | ---
license: cc-by-3.0
---
|
zolak/twitter_dataset_80_1713070287 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2789070
num_examples: 6894
download_size: 1384560
dataset_size: 2789070
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
christinacdl/Multilingual_Clickbait_Dataset | ---
license: apache-2.0
---
|
JaydenX/IGCIG | ---
task_categories:
- text-classification
- table-question-answering
- conversational
- summarization
- text-generation
- automatic-speech-recognition
- question-answering
language:
- en
tags:
- code
pretty_name: igcig_create
size_categories:
- 1K<n<10K
--- |
jayshah5696/alpaca-small-gujarati |
---
license: cc-by-nc-4.0
---
Original data source - [https://huggingface.co/datasets/tatsu-lab/alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca)
Used Google Translate API to translate the dataset into Gujarati.
### Data Instances
An example of "train" looks as follows:
```json
{
"instruction": "Identify the odd one out.",
"input": "Twitter, Instagram, Telegram",
"output": "Telegram",
"text": "Below is an instruction that describes a task...",
"gujarati_instruction": "વિષમને ઓળખો.",
"gujarati_input": "ટ્વિટર, ઇન્સ્ટાગ્રામ, ટેલિગ્રામ",
"gujarati_output": "ટેલિગ્રામ"
}
```
### Data Fields
The data fields are as follows:
* `instruction`: describes the task the model should perform. Instructions are inherited from the original 52K-instruction Alpaca dataset, where each one is unique.
* `input`: optional context or input for the task. For example, when the instruction is "Summarize the following article", the input is the article. Around 40% of the examples have an input.
* `output`: the answer to the instruction as generated by `text-davinci-003`.
* `text`: the `instruction`, `input` and `output` formatted with the [prompt template](https://github.com/tatsu-lab/stanford_alpaca#data-release) used by the authors for fine-tuning their models.
* `gujarati_instruction`: Gujarati translation of the instruction
* `gujarati_input`: Gujarati translation of the input
* `gujarati_output`: Gujarati translation of the output
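For illustration, here is a simplified sketch of how a prompt template like the one linked above stitches these fields into the `text` field. The exact wording used by the Alpaca authors lives in the linked template; this version is an approximation:

```python
def format_prompt(instruction: str, input_text: str, output: str) -> str:
    """Format one record roughly the way the Alpaca prompt template does.

    Simplified illustration only; see the linked prompt template for the
    exact wording used by the original authors.
    """
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            f"### Response:\n{output}"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        f"### Response:\n{output}"
    )

print(format_prompt("Identify the odd one out.",
                    "Twitter, Instagram, Telegram",
                    "Telegram"))
```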
### Data Splits
| | train |
|---------------|------:|
| alpaca | 88 |
|
pmualex/RAG_LLM | ---
license: llama2
---
|
ragnartd/lucas | ---
license: openrail
---
|
arieg/bw_spec_cls_4_00_noise_200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '10'
'1': '140'
'2': '2'
'3': '5'
splits:
- name: train
num_bytes: 44730986.0
num_examples: 800
- name: test
num_bytes: 1122375.0
num_examples: 20
download_size: 24737574
dataset_size: 45853361.0
---
# Dataset Card for "bw_spec_cls_4_00_noise_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HenriVocals/vozlair | ---
license: openrail
---
|
pequeno3d/juliochaves | ---
license: openrail
---
|
ZiAngGu/scannet_box3d | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2712905217.551
num_examples: 33421
download_size: 2587564254
dataset_size: 2712905217.551
---
# Dataset Card for "scannet_box"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KETI-AIR/aihub_book_summarization | ---
license: apache-2.0
---
|
hanhainebula/Embedding-Benchmark-Data | ---
license: mit
---
|
CyberHarem/araragi_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of araragi/アララギ博士 (Pokémon)
This is the dataset of araragi/アララギ博士 (Pokémon), containing 286 images and their tags.
The core tags of this character are `breasts, earrings, green_eyes, brown_hair, large_breasts, short_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 286 | 193.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/araragi_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 286 | 131.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/araragi_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 574 | 245.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/araragi_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 286 | 177.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/araragi_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 574 | 311.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/araragi_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/araragi_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, jewelry, labcoat, solo, smile, cleavage, green_skirt, mature_female, pencil_skirt |
| 1 | 12 |  |  |  |  |  | 1girl, jewelry, poke_ball_(basic), smile, solo, holding_poke_ball, cleavage, looking_at_viewer, blonde_hair, blush, labcoat |
| 2 | 21 |  |  |  |  |  | 1girl, solo, jewelry, nipples, smile, pussy, female_pubic_hair, nude, looking_at_viewer, navel, blush, mature_female, simple_background |
| 3 | 6 |  |  |  |  |  | 1girl, hetero, nipples, sex, solo_focus, vaginal, 1boy, cowgirl_position, girl_on_top, jewelry, nude, open_mouth, blonde_hair, blush, cum_in_pussy, penis, uncensored |
| 4 | 10 |  |  |  |  |  | 1girl, solo_focus, 1boy, blush, hetero, jewelry, nipples, penis, shirt_lift, censored, huge_breasts, fellatio |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jewelry | labcoat | solo | smile | cleavage | green_skirt | mature_female | pencil_skirt | poke_ball_(basic) | holding_poke_ball | looking_at_viewer | blonde_hair | blush | nipples | pussy | female_pubic_hair | nude | navel | simple_background | hetero | sex | solo_focus | vaginal | 1boy | cowgirl_position | girl_on_top | open_mouth | cum_in_pussy | penis | uncensored | shirt_lift | censored | huge_breasts | fellatio |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:----------|:-------|:--------|:-----------|:--------------|:----------------|:---------------|:--------------------|:--------------------|:--------------------|:--------------|:--------|:----------|:--------|:--------------------|:-------|:--------|:--------------------|:---------|:------|:-------------|:----------|:-------|:-------------------|:--------------|:-------------|:---------------|:--------|:-------------|:-------------|:-----------|:---------------|:-----------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | X | X | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 21 |  |  |  |  |  | X | X | | X | X | | | X | | | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | | | | | | | | | | | X | X | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 4 | 10 |  |  |  |  |  | X | X | | | | | | | | | | | | X | X | | | | | | X | | X | | X | | | | | X | | X | X | X | X |
|
progs2002/star-trek-tng-scripts | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7768730
num_examples: 156
- name: test
num_bytes: 850567
num_examples: 18
download_size: 5081082
dataset_size: 8619297
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
The raw data has been curated from <https://www.st-minutiae.com/resources/scripts/#thenextgeneration>
<br>
This has been cleaned using the code found in <https://github.com/progs2002/StarTrekTNG-ScriptGenerator/blob/master/process_data.py>
<br>
The text files have been assigned to training and testing in an 80-20 split. |
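As a rough illustration, an 80-20 assignment of script files like the one described above could be produced with a seeded shuffle. This is a hypothetical sketch, not the actual procedure used in the linked repository (note the published split counts of 156/18 are closer to 90-10):

```python
import random

def train_test_split(items, test_fraction=0.2, seed=42):
    """Shuffle items with a fixed seed and split them into train/test lists."""
    rng = random.Random(seed)
    shuffled = items[:]
    rng.shuffle(shuffled)
    n_test = max(1, round(len(shuffled) * test_fraction))
    return shuffled[n_test:], shuffled[:n_test]

# Hypothetical filenames; 156 + 18 = 174 scripts in the published dataset.
scripts = [f"episode_{i:03d}.txt" for i in range(174)]
train, test = train_test_split(scripts)
print(len(train), len(test))  # 139 35
```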
Salama1429/tarteel-ai-everyayah-Quran | ---
pretty_name: Tarteel AI - EveryAyah Dataset
dataset_info:
features:
- name: audio
dtype: audio
- name: duration
dtype: float64
- name: text
dtype: string
- name: reciter
dtype: string
splits:
- name: train
num_bytes: 262627688145.3
num_examples: 187785
- name: test
num_bytes: 25156009734.72
num_examples: 23473
- name: validation
num_bytes: 23426886730.218
num_examples: 23474
download_size: 117190597305
dataset_size: 311210584610.23804
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
language:
- ar
license:
- mit
multilinguality:
- monolingual
paperswithcode_id: tarteel-everyayah
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- automatic-speech-recognition
task_ids: []
train-eval-index:
- config: clean
task: automatic-speech-recognition
task_id: speech_recognition
splits:
train_split: train
eval_split: test
validation_split: validation
col_mapping:
audio: audio
text: text
reciter: text
metrics:
- type: wer
name: WER
- type: cer
name: CER
---
﷽
# Dataset Card for Tarteel AI's EveryAyah Dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Tarteel AI](https://www.tarteel.ai/)
- **Repository:** [Needs More Information]
- **Point of Contact:** [Mohamed Saad Ibn Seddik](mailto:ms.ibnseddik@tarteel.ai)
### Dataset Summary
This dataset is a collection of Quranic verses and their transcriptions, with diacritization, by different reciters.
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
The audio is in Arabic.
## Dataset Structure
### Data Instances
A typical data point comprises the audio file `audio` and its transcription `text`.
The `duration` is in seconds, and the reciter's name is stored in `reciter`.
An example from the dataset is:
```
{
'audio': {
'path': None,
'array': array([ 0. , 0. , 0. , ..., -0.00057983,
-0.00085449, -0.00061035]),
'sampling_rate': 16000
},
'duration': 6.478375,
'text': 'بِسْمِ اللَّهِ الرَّحْمَنِ الرَّحِيمِ',
'reciter': 'abdulsamad'
}
```
### Length
| Split | Seconds | Minutes | Hours |
| ----- | ------- | ------- | ----- |
| Training | 2985111.2642479446 | 49751.85440413241 | 829.1975734022068 |
| Validation | 372720.43139099434 | 6212.007189849905 | 103.5334531641651 |
| Test | 375509.96909399604 | 6258.499484899934 | 104.30832474833224 |
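The totals above are plain unit conversions of the summed `duration` column. As a sketch (the seconds figure is hard-coded from this card; in practice it would come from something like `sum(dataset["train"]["duration"])`):

```python
def seconds_to_minutes_hours(total_seconds):
    """Convert a duration in seconds to (minutes, hours)."""
    return total_seconds / 60, total_seconds / 3600

# Training total taken from this card
train_seconds = 2985111.2642479446
minutes, hours = seconds_to_minutes_hours(train_seconds)
print(minutes)  # ≈ 49751.85 minutes
print(hours)    # ≈ 829.20 hours
```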
### Data Fields
- audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- text: The transcription of the audio file.
- duration: The duration of the audio file.
- reciter: The reciter of the verses.
### Data Splits
| | Train | Test | Validation |
| ----- | ----- | ---- | ---------- |
| dataset | 187785 | 23473 | 23474 |
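The counts above correspond (to within rounding) to an 80/10/10 train/test/validation partition:

```python
# Split sizes taken from the table in this card
splits = {"train": 187785, "test": 23473, "validation": 23474}
total = sum(splits.values())
fractions = {name: count / total for name, count in splits.items()}
print(total)  # 234732
print({name: round(frac, 3) for name, frac in fractions.items()})
# {'train': 0.8, 'test': 0.1, 'validation': 0.1}
```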
### Reciters
- reciters_count: 36
- reciters: {'abdul_basit',
'abdullah_basfar',
'abdullah_matroud',
'abdulsamad',
'abdurrahmaan_as-sudais',
'abu_bakr_ash-shaatree',
'ahmed_ibn_ali_al_ajamy',
'ahmed_neana',
'akram_alalaqimy',
'alafasy',
'ali_hajjaj_alsuesy',
'aziz_alili',
'fares_abbad',
'ghamadi',
'hani_rifai',
'husary',
'karim_mansoori',
'khaalid_abdullaah_al-qahtaanee',
'khalefa_al_tunaiji',
'maher_al_muaiqly',
'mahmoud_ali_al_banna',
'menshawi',
'minshawi',
'mohammad_al_tablaway',
'muhammad_abdulkareem',
'muhammad_ayyoub',
'muhammad_jibreel',
'muhsin_al_qasim',
'mustafa_ismail',
'nasser_alqatami',
'parhizgar',
'sahl_yassin',
'salaah_abdulrahman_bukhatir',
'saood_ash-shuraym',
'yaser_salamah',
'yasser_ad-dussary'}
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
### Licensing Information
[CC BY 4.0](https://creativecommons.org/licenses/by/4.0/)
### Citation Information
```
```
### Contributions
This dataset was created by:
|
smangrul/hindi_instruct_v1 | ---
language:
- en
- hi
license: mit
size_categories:
- 10K<n<100K
task_categories:
- text-generation
dataset_info:
features:
- name: category
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 32114704.0
num_examples: 20215
- name: test
num_bytes: 3408295.0
num_examples: 7788
download_size: 14542561
dataset_size: 35522999.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Hindi Instruct V1 Dataset
This dataset is curated by Sourab Mangrulkar. It was developed on top of the `HuggingFaceH4/no_robots` dataset.
First, the dataset was translated using the `ai4bharat/indictrans2-en-indic-1B` state-of-the-art translation model developed by AI4Bharat.
Here, it is important to note that the sequence length limit is 256 for input and output sequences.
Hence, I split the text into individual sentences on full stops, create minibatches for translation, and then stitch the translations back together.
```python
from datasets import load_dataset

# batch_translate, en_indic_model and en_indic_tokenizer come from the
# IndicTrans2 inference utilities (https://github.com/AI4Bharat/IndicTrans2)

def chunk_and_translate(content):
    # Split on full stops and drop empty chunks so each piece stays
    # within the model's 256-token sequence limit
    minibatch = content.split(".")
    minibatch = [k for k in minibatch if len(k.strip()) > 0]
    translations = batch_translate(minibatch, "eng_Latn", "hin_Deva", en_indic_model, en_indic_tokenizer)
    translated_content = " ".join(translations)
    return translated_content

def preprocess(sample):
    for turn in sample["messages"]:
        turn["content"] = chunk_and_translate(turn["content"])
    return {"messages": sample["messages"]}

dataset = load_dataset("HuggingFaceH4/no_robots")
dataset = dataset.map(
    preprocess,
    batched=False
)
```
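To see the chunking behaviour in isolation, the translator can be stubbed out (the `stub` function below replaces `batch_translate` purely for illustration and is not part of the original pipeline):

```python
def chunk_and_translate(content, batch_translate):
    # Same chunking logic as above, with the translator injected
    minibatch = content.split(".")
    minibatch = [k for k in minibatch if len(k.strip()) > 0]
    translations = batch_translate(minibatch)
    return " ".join(translations)

# Identity "translator" that just tags each chunk, so the batching is visible
stub = lambda chunks: [f"<{c.strip()}>" for c in chunks]
out = chunk_and_translate("Hello there. How are you. ", stub)
print(out)  # <Hello there> <How are you>
```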
Next, inspired by the demo given by the OpenHathi team [here](https://youtu.be/WKfVzJSDAd8), I did the following:
## Remove Coding samples
After manually going through the coding samples and trying approaches such as keeping user messages in Hindi while leaving the assistant's code snippets in English,
I found that this still would not work, because users often paste code snippets and prompt the assistant to help with debugging.
As this would be noise for Hindi instruction tuning, I simply removed all the coding examples.
## Add English and Hinglish instructions to tell the model to reply in Hindi for English user messages and add the CoT prompt `Frist write in English and then translate to Hindi`
With a probability of 0.2, we append the prompts `Hindi mein jawab dena.`/`Reply in Hindi.`/`Summarize in Hindi` to the end of English user messages whose assistant replies are in Hindi.
This helps when the user writes something in English but wants the model to reply in Hindi.
For Chain-of-Thought ability, the OpenHathi team used the prompt "Frist write in English and then translate to Hindi", which leads to better responses.
To enable this, we append the prompt `Frist write in English and then translate to Hindi`/`Pehle English mein likhna, phir Hindi mein translate karna.` to English user messages and
format the assistant message with the template `{english_reply}\nTranslation of the above into Hindi:\n{translated_hindi_reply}`.
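The snippets in this card rely on a `bernoulli_sample` helper that is never defined here; a minimal version (an assumption about the author's code, not a copy of it) could be:

```python
import random

def bernoulli_sample(p: float) -> bool:
    """Return True with probability p."""
    return random.random() < p

# Sanity check: with a fixed seed, the empirical rate is close to p
random.seed(0)
rate = sum(bernoulli_sample(0.2) for _ in range(10_000)) / 10_000
print(rate)  # ≈ 0.2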
```python
hindi_reply_prompt = "Hindi mein jawab dena."
english_reply_prompt = "{prefix} in Hindi."
cot_prompt = "Frist write in English and then translate to Hindi"
cot_hindi_prompt = "Pehle English mein likhna, phir Hindi mein translate karna."

# Note: despite its name, this function appends reply/CoT prompts to user turns;
# it does not add a system prompt. bernoulli_sample(p) returns True with probability p.
def add_system_prompt(sample):
    p = 0.2
    language_p = 0.5
    suffix_hindi_prompt = bernoulli_sample(p)
    suffix_cot_prompt = bernoulli_sample(p)
    use_hindi_reply_prompt = bernoulli_sample(language_p)
    use_hindi_cot_prompt = bernoulli_sample(language_p)
    for i, turn in enumerate(sample["hindi_messages"]):
        if suffix_hindi_prompt:
            # User turns sit at even indices; append a "reply in Hindi" prompt
            if use_hindi_reply_prompt and i % 2 == 0:
                turn["content"] = f"{sample['messages'][i]['content']} {hindi_reply_prompt}"
            elif i % 2 == 0:
                prefix = "Summarize" if sample["category"] == "Summarize" else "Reply"
                turn["content"] = f"{sample['messages'][i]['content']} {english_reply_prompt.format(prefix=prefix)}"
        elif suffix_cot_prompt:
            # Append a CoT prompt to user turns; wrap assistant turns in the translation template
            if use_hindi_cot_prompt and i % 2 == 0:
                turn["content"] = f"{sample['messages'][i]['content']} {cot_hindi_prompt}"
            elif i % 2 == 0:
                turn["content"] = f"{sample['messages'][i]['content']} {cot_prompt}"
            else:
                turn["content"] = f"{sample['messages'][i]['content']}\nTranslation of the above into Hindi:\n{turn['content']}"
    return {"hindi_messages": sample["hindi_messages"]}

orig_dataset = orig_dataset.map(
    add_system_prompt,
    batched=False
)
```
Example outputs:
```bash
[{'content': 'Write a funny, short story about someone who will stop at nothing to go green and reduce their carbon footprint. \n Pehle English mein likhna, phir Hindi mein translate karna.', 'role': 'user'}, {'content': 'Garry has a real green thumb, and taking care of the environment is extremely important to him. He takes every possible step to cut his carbon footprint, and it inspires and mystifies everyone around him in equal measure. Each night before he goes to bed, he cuts off the power to his house so that his electronic devices don’t draw idle power. \n\nAlmost all of his appliances are quite old, as Garry wants to get as much use as possible out of them before contributing to new pollution-generating production. The only exceptions are his new, water-efficient dishwasher and his solar-powered hot water tank. \n\nThe Homeowner’s Association hates Garry because he grows corn on his front lawn. All of his food is locally sourced as much as possible, and he commutes to work on a bike that doesn’t look like it should work. He built it himself out of recycled parts, and it seems to jump and rock like Fred Flinstone’s car. Yet, it stays in one piece and carries him faithfully to work without contributing to pollution. All in all, there’s nothing Garry won’t do to keep his carbon footprint small. 
\nTranslation of the above into Hindi:\nगैरी का अंगूठा असली में हरा है और पर्यावरण की देखभाल करना उसके लिए बेहद महत्वपूर्ण है। वह अपने कार्बन पदचिह्न को कम करने के लिए हर संभव कदम उठाता है, और यह उसके आसपास सभी लोगों को समान रूप से प्रेरित करता है और रहस्यमयी बनाता है। हर रात सोने से पहले वह अपने घर की बिजली काट देता है ताकि उसके इलेक्ट्रॉनिक उपकरण बेकार बिजली न खींच सकें। उनके लगभग सभी उपकरण काफी पुराने हैं, क्योंकि गैरी नए प्रदूषण पैदा करने वाले उत्पादन में योगदान देने से पहले उनका अधिक से अधिक उपयोग करना चाहते हैं। केवल अपवाद उनके नए, पानी-कुशल डिशवॉशर और उनकी सौर ऊर्जा से चलने वाली गर्म पानी की टंकी हैं। मकान मालिक संघ गैरी से नफरत करता है क्योंकि वह अपने सामने के लॉन में मकई उगाता है। उनका सारा खाना जितना संभव हो सके स्थानीय रूप से प्राप्त किया जाता है, और वह एक ऐसी बाइक पर काम करने के लिए आता है जो ऐसा नहीं लगता कि यह काम करना चाहिए। उन्होंने इसे पुनर्नवीनीकरण भागों से खुद बनाया था, और यह फ्रेड फ्लिनस्टोन की कार की तरह कूदता हुआ प्रतीत होता है। फिर भी, यह एक टुकड़े में रहता है और उसे प्रदूषण में योगदान दिए बिना काम करने के लिए ईमानदारी से ले जाता है। कुल मिलाकर, ऐसा कुछ भी नहीं है जो गैरी अपने कार्बन पदचिह्न को छोटा रखने के लिए न करे।', 'role': 'assistant'}]
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
[{'content': 'Help write a letter of 100 -200 words to my future self for Kyra, reflecting on her goals and aspirations. Frist write in English and then translate to Hindi', 'role': 'user'}, {'content': "Dear Future Self,\n\nI hope you're happy and proud of what you've achieved. As I write this, I'm excited to think about our goals and how far you've come. One goal was to be a machine learning engineer. I hope you've worked hard and become skilled in this field. Keep learning and innovating. Traveling was important to us. I hope you've seen different places and enjoyed the beauty of our world. Remember the memories and lessons. Starting a family mattered to us. If you have kids, treasure every moment. Be patient, loving, and grateful for your family.\n\nTake care of yourself. Rest, reflect, and cherish the time you spend with loved ones. Remember your dreams and celebrate what you've achieved. Your determination brought you here. I'm excited to see the person you've become, the impact you've made, and the love and joy in your life. 
Embrace opportunities and keep dreaming big.\n\nWith love,\nKyra\nTranslation of the above into Hindi:\nप्रिय भविष्य स्वयं, मुझे आशा है कि आपने जो हासिल किया है उससे आप खुश और गर्वित हैं। जब मैं यह लिख रहा हूँ, तो मैं अपने लक्ष्यों के बारे में सोचने के लिए उत्साहित हूं और सोचता हूँ कि आप कितनी दूर आ गए हैं। एक लक्ष्य मशीन लर्निंग इंजीनियर बनना था। मुझे आशा है कि आपने कड़ी मेहनत की होगी और इस क्षेत्र में कुशल बनेंगे। सीखते रहें और नवाचार करते रहें। हमारे लिए यात्रा करना महत्वपूर्ण था। मुझे आशा है कि आपने विभिन्न स्थानों को देखा होगा और हमारी दुनिया की सुंदरता का आनंद लिया होगा। यादों और सबक को याद रखें। परिवार शुरू करना हमारे लिए मायने रखता था। अगर आपके बच्चे हैं, तो हर पल को संजो कर रखें। अपने परिवार के लिए धैर्य, प्यार और आभारी रहें। अपना ख्याल रखें। आराम करें, सोचें और अपने प्रियजनों के साथ बिताए समय को संजो कर रखें। अपने सपनों को याद रखें और जो आपने हासिल किया है उसका जश्न मनाएं। आपका दृढ़ संकल्प आपको यहाँ ले आया है। आप जिस व्यक्ति के रूप में उभरे हैं, आपने जो प्रभाव डाला है और आपके जीवन में प्यार और खुशी को देखकर मैं उत्साहित हूं। अवसरों को अपनाएँ और बड़े सपने देखते रहें। प्यार के साथ, कायरा।", 'role': 'assistant'}]
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
[{'content': "Answer the following questions using the passage below:\nWho was Harry Potter written by?\nWho originally published the Harry Potter series?\n\nHarry Potter is a series of seven fantasy novels written by British author J. K. Rowling. The novels chronicle the lives of a young wizard, Harry Potter, and his friends Hermione Granger and Ron Weasley, all of whom are students at Hogwarts School of Witchcraft and Wizardry. The main story arc concerns Harry's conflict with Lord Voldemort, a dark wizard who intends to become immortal, overthrow the wizard governing body known as the Ministry of Magic and subjugate all wizards and Muggles (non-magical people).\n\nThe series was originally published in English by Bloomsbury in the United Kingdom and Scholastic Press in the United States. All versions around the world are printed by Grafica Veneta in Italy.[1] A series of many genres, including fantasy, drama, coming-of-age fiction, and the British school story (which includes elements of mystery, thriller, adventure, horror, and romance), the world of Harry Potter explores numerous themes and includes many cultural meanings and references.[2] According to Rowling, the main theme is death.[3] Other major themes in the series include prejudice, corruption, and madness.[4] Hindi mein jawab dena.", 'role': 'user'}, {'content': 'हैरी पॉटर ब्रिटिश लेखक जे. द्वारा लिखा गया था। के. रालिंग यह श्रृंखला मूल रूप से ब्रिटेन में ब्लूम्सबरी और अमेरिका में स्कॉलास्टिक प्रेस द्वारा प्रकाशित की गई थी।', 'role': 'assistant'}]
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
[{'content': 'Write a summary of this article about Euripides by reducing it to less than 90 words. Change the dating system from B.C. to BCE.\n\nThe Ancient Greeks, particularly the Ancient Athenians, are given the credit for inventing theater itself. As a result of that, theater became part of the culture, as well as everything that went along with it. There are theaters located throughout Ancient Greece. Also as a result of this fact, playwrights began to spring up all over Greece. One of the most well known of these is Euripides, who is especially noted for his tragedies. He is also one of the only playwrights whose work has actually survived. Here is some more information about this famous playwright from Ancient Greece:\n\nInformation About Euripides\n\nEuripides as born in 480 B.C. in Salamis, which is an island in the Saronic Gulf in Greece. It is interesting to note that he was born on the same day as the Battle of Salamis, where the ultimate victory of the Greeks prevented an invasion by the Persian Empire. He died in 406 B.C. in the city-state of Macedonia. His mother’s name was Cleito and his father’s name was Mnesarchus. His father insisted that he should be an athlete. Although his parents valued athletics, however, he wasn’t restricted in his education. He also learned artistic pursuits, such as painting, as well as philosophy from masters such as Anaxagorus. He was also a dancer and torch bearer at the rites of Apollo Zosterius. Later on, it was clear that he had a knack for the stage and he eventually began writing the plays he was most famous for.\n\n\nHis personal life, however, wasn’t as successful as his professional one. He was actually married twice and in both instances, his wives weren’t faithful to him. He also had three sons from one of these marriages. After this happened, he moved to a cave in Salamis because he wanted to live alone. 
Referred to as the Cave of Euripides, after his death his followers started a shrine in his honor.\n\nWorks of Euripides\n\nThere is actually some debate as to how many plays there are that were written by Euripides that we know of. Some sources say that there are 95, others say that there are 92. One thing is certain, however, is that his impact on theater is vast and his plays are still being performed today. Of these plays, a much smaller number survived in its complete form. Only about eighteen or nineteen of his plays had been found in their entirety. Besides the fact that his plays have been entertaining people for centuries, his work has also influenced playwrights through the ages. In other words, his artistic legacy has survived the ages.\n\nEuripides was also a contemporary of two other famous playwrights. Sophocles’ first play deputed thirteen years before his, and Aeschylus’s masterwork, Oresteia, debuted three years before Euripides’ first play was released.\n\nAs you can see, Euripedes has had a big impact on theater not only in Ancient Greece, but in the whole world. Sure, theater was invented in Ancient Greece, but the works of people like Euripides is everlasting and they’re still being performed today. Summarize in Hindi.', 'role': 'user'}, {'content': 'प्राचीन यूनानी नाटककार यूरिपिड्स का जन्म 480 ईसा पूर्व में ग्रीक द्वीप सलामिस में हुआ था। युवावस्था में उन्होंने एथलेटिक्स, दर्शन और चित्रकला का अध्ययन किया। वह भगवान अपोलो के संस्कारों के दौरान एक मशाल वाहक और नर्तक थे। हालाँकि इस बात पर सवाल हैं कि उन्होंने कितने नाटक लिखे, उनमें से 18 या 19 आज भी पूर्ण रूप में जीवित हैं। अपनी दूसरी असफल शादी के बाद, उन्होंने सलामिस पर एक गुफा में निवास किया जिसे अब यूरिपिड्स की गुफा कहा जाता है। 406 ईसा पूर्व में उनकी मृत्यु हो गई जो अब मैसेडोनिया है।', 'role': 'assistant'}]
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
[{'content': 'Can foxes fit down rabbit burrows? Reply in Hindi.', 'role': 'user'}, {'content': 'लोमड़ी खरगोश के बिलों में फिट नहीं हो सकते हैं, लेकिन यह उन्हें कोशिश करने से नहीं रोकता है। एक लोमड़ी जितना संभव हो उतना नीचे जाएगी और फिर खरगोश को पकड़ने और उसे बाहर निकालने के प्रयास में अपने अर्ध-वापस लेने योग्य पंजों का उपयोग करके खुदाई करेगी।', 'role': 'assistant'}]
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
```
## Hinglish (Code-Mixing and Transliteration) for colloquial usage
We use the below prompts with the `HinGE` dataset and the Hinglish-TOP dataset (`https://github.com/google-research-datasets/Hinglish-TOP-Dataset`) to add samples covering different combinations of translation between English, Hindi, and Hinglish.
```python
import random

# bernoulli_sample(p) returns True with probability p
english_to_hinglish_prompt_en = "Translate the following from English into Hinglish."
hinglish_to_english_prompt_en = "Translate the following from Hinglish into English."
english_to_hinglish_prompt_hn = "English se Hinglish mein translate kare."
hinglish_to_english_prompt_hn = "Hinglish se English mein translate kare."
hindi_to_hinglish_prompt_hi = "निम्नलिखित का हिंदी से हिंग्लिश में अनुवाद करें।"
hinglish_to_hindi_prompt_hi = "निम्नलिखित का हिंग्लिश से हिंदी में अनुवाद करें।"
hindi_to_hinglish_prompt_hn = "Hindi se Hinglish mein translate kare."
hinglish_to_hindi_prompt_hn = "Hinglish se Hindi mein translate kare."

def format_to_norobots(sample):
    language_p = 0.5
    random_num = random.random()
    use_hn_prompt = bernoulli_sample(language_p)
    hindi_messages = []
    if random_num <= 0.25:
        # English -> Hinglish
        prompt = english_to_hinglish_prompt_hn if use_hn_prompt else english_to_hinglish_prompt_en
        user_message = {"content": f"{prompt}\n{sample['English']}", "role": "user"}
        asst_message = {"content": sample["Hinglish"], "role": "assistant"}
    elif random_num <= 0.5:
        # Hinglish -> English
        prompt = hinglish_to_english_prompt_hn if use_hn_prompt else hinglish_to_english_prompt_en
        user_message = {"content": f"{prompt}\n{sample['Hinglish']}", "role": "user"}
        asst_message = {"content": sample["English"], "role": "assistant"}
    elif random_num <= 0.75:
        # Hindi -> Hinglish
        prompt = hindi_to_hinglish_prompt_hn if use_hn_prompt else hindi_to_hinglish_prompt_hi
        user_message = {"content": f"{prompt}\n{sample['Hindi']}", "role": "user"}
        asst_message = {"content": sample["Hinglish"], "role": "assistant"}
    else:
        # Hinglish -> Hindi
        prompt = hinglish_to_hindi_prompt_hn if use_hn_prompt else hinglish_to_hindi_prompt_hi
        user_message = {"content": f"{prompt}\n{sample['Hinglish']}", "role": "user"}
        asst_message = {"content": sample["Hindi"], "role": "assistant"}
    hindi_messages.extend([user_message, asst_message])
    return {"category": "Transliteration and Code Mixing", "hindi_messages": hindi_messages}
```
Example output:
```bash
[{'content': 'Hinglish se English mein translate kare.\nMere liye reminder set karo to wake up at 6:30 am tomorrow.',
'role': 'user'},
{'content': 'Set a reminder for me to wake up at 630 am tomorrow.',
'role': 'assistant'}]
```
## Add the Hindi-specific system prompt
With probability 0.2, we add the system prompt `You are a native Hindi speaker who can converse at expert level in both Hindi and colloquial Hinglish.`
```python
system_prompt = "You are a native Hindi speaker who can converse at expert level in both Hindi and colloquial Hinglish."

# bernoulli_sample(p) returns True with probability p
def add_system_prompt(sample):
    p = 0.2
    add_system_message = bernoulli_sample(p)
    first_message = sample["hindi_messages"][0]
    if add_system_message:
        if first_message["role"] == "system":
            first_message["content"] = f"{first_message['content']} {system_prompt}"
        else:
            system_message = {"content": system_prompt, "role": "system"}
            sample["hindi_messages"].insert(0, system_message)
    return {"hindi_messages": sample["hindi_messages"]}
```
Example output:
```bash
{'category': 'Generation',
'hindi_messages': [{'content': 'You are a native Hindi speaker who can converse at expert level in both Hindi and colloquial Hinglish.',
'role': 'system'},
{'content': 'Help write a letter of 100 -200 words to my future self for Kyra, reflecting on her goals and aspirations. Frist write in English and then translate to Hindi',
'role': 'user'},
{'content': "Dear Future Self,\n\nI hope you're happy and proud of what you've achieved. As I write this, I'm excited to think about our goals and how far you've come. One goal was to be a machine learning engineer. I hope you've worked hard and become skilled in this field. Keep learning and innovating. Traveling was important to us. I hope you've seen different places and enjoyed the beauty of our world. Remember the memories and lessons. Starting a family mattered to us. If you have kids, treasure every moment. Be patient, loving, and grateful for your family.\n\nTake care of yourself. Rest, reflect, and cherish the time you spend with loved ones. Remember your dreams and celebrate what you've achieved. Your determination brought you here. I'm excited to see the person you've become, the impact you've made, and the love and joy in your life. Embrace opportunities and keep dreaming big.\n\nWith love,\nKyra\nTranslation of the above into Hindi:\nप्रिय भविष्य स्वयं, मुझे आशा है कि आपने जो हासिल किया है उससे आप खुश और गर्वित हैं। जब मैं यह लिख रहा हूँ, तो मैं अपने लक्ष्यों के बारे में सोचने के लिए उत्साहित हूं और सोचता हूँ कि आप कितनी दूर आ गए हैं। एक लक्ष्य मशीन लर्निंग इंजीनियर बनना था। मुझे आशा है कि आपने कड़ी मेहनत की होगी और इस क्षेत्र में कुशल बनेंगे। सीखते रहें और नवाचार करते रहें। हमारे लिए यात्रा करना महत्वपूर्ण था। मुझे आशा है कि आपने विभिन्न स्थानों को देखा होगा और हमारी दुनिया की सुंदरता का आनंद लिया होगा। यादों और सबक को याद रखें। परिवार शुरू करना हमारे लिए मायने रखता था। अगर आपके बच्चे हैं, तो हर पल को संजो कर रखें। अपने परिवार के लिए धैर्य, प्यार और आभारी रहें। अपना ख्याल रखें। आराम करें, सोचें और अपने प्रियजनों के साथ बिताए समय को संजो कर रखें। अपने सपनों को याद रखें और जो आपने हासिल किया है उसका जश्न मनाएं। आपका दृढ़ संकल्प आपको यहाँ ले आया है। आप जिस व्यक्ति के रूप में उभरे हैं, आपने जो प्रभाव डाला है और आपके जीवन में प्यार और खुशी को देखकर मैं उत्साहित हूं। अवसरों को अपनाएँ और बड़े सपने देखते रहें। प्यार के साथ, कायरा।",
'role': 'assistant'}]}
``` |
philipphager/baidu-ultr-pretrain | ---
license: cc-by-nc-4.0
---
|
open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-ac-hal-7b-ep3 | ---
pretty_name: Evaluation run of luffycodes/vicuna-class-shishya-ac-hal-7b-ep3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [luffycodes/vicuna-class-shishya-ac-hal-7b-ep3](https://huggingface.co/luffycodes/vicuna-class-shishya-ac-hal-7b-ep3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-ac-hal-7b-ep3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T15:05:04.315196](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-ac-hal-7b-ep3/blob/main/results_2023-12-16T15-05-04.315196.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.504457394086061,\n\
\ \"acc_stderr\": 0.0339872992064852,\n \"acc_norm\": 0.5129744502613426,\n\
\ \"acc_norm_stderr\": 0.03491403416694853,\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326923,\n \"mc2\": 0.43032912918561517,\n\
\ \"mc2_stderr\": 0.014997775568928156\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4249146757679181,\n \"acc_stderr\": 0.014445698968520769,\n\
\ \"acc_norm\": 0.4462457337883959,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5800637323242382,\n\
\ \"acc_stderr\": 0.004925394995490124,\n \"acc_norm\": 0.7697669786895041,\n\
\ \"acc_norm_stderr\": 0.004201215520808244\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183238,\n\
\ \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117317,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117317\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149352,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149352\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.335978835978836,\n \"acc_stderr\": 0.02432631052914915,\n \"acc_norm\"\
: 0.335978835978836,\n \"acc_norm_stderr\": 0.02432631052914915\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5612903225806452,\n \"acc_stderr\": 0.02822949732031722,\n \"\
acc_norm\": 0.5612903225806452,\n \"acc_norm_stderr\": 0.02822949732031722\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n \"\
acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.037425970438065864,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.037425970438065864\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.0347327959083696,\n \"acc_norm\"\
: 0.6111111111111112,\n \"acc_norm_stderr\": 0.0347327959083696\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845457,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845457\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514566,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514566\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4957983193277311,\n \"acc_stderr\": 0.0324773433444811,\n \
\ \"acc_norm\": 0.4957983193277311,\n \"acc_norm_stderr\": 0.0324773433444811\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7064220183486238,\n \"acc_stderr\": 0.019525151122639667,\n \"\
acc_norm\": 0.7064220183486238,\n \"acc_norm_stderr\": 0.019525151122639667\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475544,\n\
\ \"acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6919831223628692,\n \"acc_stderr\": 0.0300523893356057,\n \
\ \"acc_norm\": 0.6919831223628692,\n \"acc_norm_stderr\": 0.0300523893356057\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n\
\ \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012349,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012349\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n\
\ \"acc_stderr\": 0.016617501738763397,\n \"acc_norm\": 0.6845466155810983,\n\
\ \"acc_norm_stderr\": 0.016617501738763397\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.026680134761679214,\n\
\ \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.026680134761679214\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.02840830202033269,\n\
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.02840830202033269\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581993,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581993\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5895061728395061,\n \"acc_stderr\": 0.027371350925124768,\n\
\ \"acc_norm\": 0.5895061728395061,\n \"acc_norm_stderr\": 0.027371350925124768\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.028602085862759412,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.028602085862759412\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3748370273794003,\n\
\ \"acc_stderr\": 0.012363652467551927,\n \"acc_norm\": 0.3748370273794003,\n\
\ \"acc_norm_stderr\": 0.012363652467551927\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.03034326422421352,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.03034326422421352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.48856209150326796,\n \"acc_stderr\": 0.020222541515610863,\n \
\ \"acc_norm\": 0.48856209150326796,\n \"acc_norm_stderr\": 0.020222541515610863\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913507,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235926,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235926\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.034010526201040885,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.034010526201040885\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326923,\n \"mc2\": 0.43032912918561517,\n\
\ \"mc2_stderr\": 0.014997775568928156\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7174427782162589,\n \"acc_stderr\": 0.012654062850971396\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/luffycodes/vicuna-class-shishya-ac-hal-7b-ep3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|arc:challenge|25_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|gsm8k|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hellaswag|10_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T15-05-04.315196.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T15-05-04.315196.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- '**/details_harness|winogrande|5_2023-12-16T15-05-04.315196.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T15-05-04.315196.parquet'
- config_name: results
data_files:
- split: 2023_12_16T15_05_04.315196
path:
- results_2023-12-16T15-05-04.315196.parquet
- split: latest
path:
- results_2023-12-16T15-05-04.315196.parquet
---
# Dataset Card for Evaluation run of luffycodes/vicuna-class-shishya-ac-hal-7b-ep3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [luffycodes/vicuna-class-shishya-ac-hal-7b-ep3](https://huggingface.co/luffycodes/vicuna-class-shishya-ac-hal-7b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-ac-hal-7b-ep3",
"harness_winogrande_5",
split="train")
```
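The config names in the YAML header above follow a simple convention derived from the harness task name and the few-shot count: non-alphanumeric separators (`-`, `:`) become underscores, with a `harness_` prefix. A small illustrative helper (not an official API) can build the config name for any task:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Map a harness task name and few-shot count to this repo's config name.

    e.g. "hendrycksTest-world_religions" with 5 shots becomes
    "harness_hendrycksTest_world_religions_5".
    """
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{num_fewshot}"


# These match the config names listed in the YAML header above.
print(harness_config_name("hendrycksTest-world_religions", 5))
print(harness_config_name("truthfulqa:mc", 0))
print(harness_config_name("winogrande", 5))
```

The resulting string can be passed as the second argument to `load_dataset` in the snippet above.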
## Latest results
These are the [latest results from run 2023-12-16T15:05:04.315196](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__vicuna-class-shishya-ac-hal-7b-ep3/blob/main/results_2023-12-16T15-05-04.315196.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.504457394086061,
"acc_stderr": 0.0339872992064852,
"acc_norm": 0.5129744502613426,
"acc_norm_stderr": 0.03491403416694853,
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326923,
"mc2": 0.43032912918561517,
"mc2_stderr": 0.014997775568928156
},
"harness|arc:challenge|25": {
"acc": 0.4249146757679181,
"acc_stderr": 0.014445698968520769,
"acc_norm": 0.4462457337883959,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.5800637323242382,
"acc_stderr": 0.004925394995490124,
"acc_norm": 0.7697669786895041,
"acc_norm_stderr": 0.004201215520808244
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4868421052631579,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.4868421052631579,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183238,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117317,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117317
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149352,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149352
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.02432631052914915,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.02432631052914915
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.02822949732031722,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.02822949732031722
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.037425970438065864,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.037425970438065864
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0347327959083696,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0347327959083696
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845457,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845457
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4846153846153846,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.4846153846153846,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514566,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514566
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4957983193277311,
"acc_stderr": 0.0324773433444811,
"acc_norm": 0.4957983193277311,
"acc_norm_stderr": 0.0324773433444811
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7064220183486238,
"acc_stderr": 0.019525151122639667,
"acc_norm": 0.7064220183486238,
"acc_norm_stderr": 0.019525151122639667
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6919831223628692,
"acc_stderr": 0.0300523893356057,
"acc_norm": 0.6919831223628692,
"acc_norm_stderr": 0.0300523893356057
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5398773006134969,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.5398773006134969,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.04656147110012349,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.04656147110012349
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6845466155810983,
"acc_stderr": 0.016617501738763397,
"acc_norm": 0.6845466155810983,
"acc_norm_stderr": 0.016617501738763397
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.026680134761679214,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.026680134761679214
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.02840830202033269,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.02840830202033269
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581993,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581993
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5895061728395061,
"acc_stderr": 0.027371350925124768,
"acc_norm": 0.5895061728395061,
"acc_norm_stderr": 0.027371350925124768
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.028602085862759412,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.028602085862759412
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3748370273794003,
"acc_stderr": 0.012363652467551927,
"acc_norm": 0.3748370273794003,
"acc_norm_stderr": 0.012363652467551927
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.03034326422421352,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.03034326422421352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.48856209150326796,
"acc_stderr": 0.020222541515610863,
"acc_norm": 0.48856209150326796,
"acc_norm_stderr": 0.020222541515610863
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913507,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235926,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235926
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.034010526201040885,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.034010526201040885
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326923,
"mc2": 0.43032912918561517,
"mc2_stderr": 0.014997775568928156
},
"harness|winogrande|5": {
"acc": 0.7174427782162589,
"acc_stderr": 0.012654062850971396
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
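The overall `"all"` accuracy above is a macro-average over the per-task scores. As an illustrative sketch (not the harness's exact aggregation code), averaging the MMLU subtask accuracies from a results dict shaped like the JSON above looks like this:

```python
def mmlu_macro_average(results: dict) -> float:
    """Average `acc` over all hendrycksTest subtasks in a results dict."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)


# Tiny example with made-up task entries in the same shape as the JSON above.
sample = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5},
    "harness|hendrycksTest-virology|5": {"acc": 0.3},
    "harness|winogrande|5": {"acc": 0.7},  # ignored: not an MMLU subtask
}
print(mmlu_macro_average(sample))  # → 0.4
```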
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nguyenminh871/hazelcast_3_3_EA | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: func
dtype: string
- name: target
dtype: bool
- name: project
dtype: string
splits:
- name: hazelcast_3_3_EA
num_bytes: 8353741
num_examples: 3765
download_size: 1934896
dataset_size: 8353741
---
# Dataset Card for "hazelcast_3_3_EA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
killah-t-cell/movie_stills_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 2868263.0
num_examples: 10
download_size: 2869997
dataset_size: 2868263.0
---
# Dataset Card for "movie_stills_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sotiris1987/ai | ---
license: apache-2.0
---
|
rjaiswal/watches-plus-3D-views-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 16328707.0
num_examples: 186
download_size: 16234485
dataset_size: 16328707.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "watches-plus-3D-views-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
m111styd4y/marisam | ---
license: openrail
---
|
CyberHarem/mirin_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mirin (Granblue Fantasy)
This is the dataset of mirin (Granblue Fantasy), containing 73 images and their tags.
The core tags of this character are `blonde_hair, long_hair, blue_eyes, bangs, bow, breasts, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 73 | 85.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mirin_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 73 | 53.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mirin_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 166 | 112.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mirin_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 73 | 77.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mirin_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 166 | 154.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mirin_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mirin_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------|
| 0 | 73 |  |  |  |  |  | 1girl, solo, smile, looking_at_viewer, blush, kimono, holding, open_mouth, katana, wide_sleeves, skirt, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | looking_at_viewer | blush | kimono | holding | open_mouth | katana | wide_sleeves | skirt | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------|:---------|:----------|:-------------|:---------|:---------------|:--------|:-------------------|
| 0 | 73 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
|
TeetouchQQ/train_data | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: source
dtype: string
- name: raw_entities
struct:
- name: EMAIL
sequence: string
- name: ID_NUM
sequence: string
- name: NAME_STUDENT
sequence: string
- name: PHONE_NUM
sequence: string
- name: STREET_ADDRESS
sequence: string
- name: URL_PERSONAL
sequence: string
- name: USERNAME
sequence: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 54446342
num_examples: 5785
download_size: 25486709
dataset_size: 54446342
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_wnli_were_was | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1276
num_examples: 6
- name: test
num_bytes: 5502
num_examples: 17
- name: train
num_bytes: 9838
num_examples: 47
download_size: 15482
dataset_size: 16616
---
# Dataset Card for "MULTI_VALUE_wnli_were_was"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ewof/lamini-instruct-unfiltered-deduped | ---
pretty_name: LaMini Instruct Unfiltered Deduped
size_categories:
- 1M<n<10M
---
This dataset is LaMini from MBZUAI/LaMini-instruction, with instances of blatant alignment removed and duplicates dropped.
2290278 instructions remain.
I merged the parquets from the original repo with parquet2json, then ran clean_format_dedupe.py on the resulting jsonl.
Credit to ehardford for his contains_unwanted_words function.
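The filter-and-dedupe step can be sketched roughly as follows. This is a minimal sketch, not the actual clean_format_dedupe.py: the word list here is illustrative, and the real contains_unwanted_words list is much longer.

```python
# Illustrative phrase list; the real contains_unwanted_words list is longer.
UNWANTED_WORDS = ["as an ai language model", "i cannot provide"]

def contains_unwanted_words(text: str) -> bool:
    """True if the text contains any alignment boilerplate phrase."""
    lowered = text.lower()
    return any(word in lowered for word in UNWANTED_WORDS)

def clean_and_dedupe(rows):
    """Drop aligned responses and exact duplicates, keeping first occurrences."""
    seen = set()
    for row in rows:
        if contains_unwanted_words(row["response"]):
            continue
        key = (row["instruction"], row["response"])
        if key in seen:
            continue
        seen.add(key)
        yield row

rows = [
    {"instruction": "Say hi", "response": "Hi!"},
    {"instruction": "Say hi", "response": "Hi!"},        # exact duplicate
    {"instruction": "Opine", "response": "As an AI language model, I can't."},
]
print(list(clean_and_dedupe(rows)))  # only the first row survives
```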
|
imvladikon/stsb_he | ---
dataset_info:
features:
- name: label
dtype: float64
- name: idx
dtype: int64
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 3122340
num_examples: 14597
- name: validation
num_bytes: 670209
num_examples: 3489
download_size: 1879461
dataset_size: 3792549
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
task_categories:
- sentence-similarity
language:
- he
---
## Description
A machine-translated Hebrew version of the STS-B dataset, with additional records: `augmented` (non-matching records generated by a weak generative model) and `chatgpt` (paraphrases generated by ChatGPT).
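Since each record carries a `source` tag, the three variants can be separated with a simple filter. A minimal sketch over records mirroring the sample below, with the Hebrew sentences elided for brevity:

```python
# Records mirroring the sample below; sentence fields elided for brevity.
records = [
    {"label": 4.667, "idx": 13, "source": "machine-translated"},
    {"label": 0.0, "idx": 13, "source": "augmented"},
    {"label": 4.0, "idx": 13, "source": "chatgpt"},
]

def by_source(rows, source):
    """Return only the rows whose `source` field matches."""
    return [r for r in rows if r["source"] == source]

augmented = by_source(records, "augmented")
print(augmented)  # the single augmented record, labeled 0.0
```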
## Sample
```json
[{'label': 4.666999816894531, 'idx': 13, 'sentence1': 'אדם מקפל פיסת נייר.', 'sentence2': 'מישהו מקפל פיסת נייר.', 'source': 'machine-translated'},
{'label': 0.0, 'idx': 13, 'sentence1': 'אדם מקפל פיסת נייר.', 'sentence2': 'כתב מייל.', 'source': 'augmented'},
{'label': 4.0, 'idx': 13, 'sentence1': 'אדם מקפל פיסת נייר.', 'sentence2': 'אדם מכפיל ניידת נייר.', 'source': 'chatgpt'}]
``` |
vwxyzjn/openhermes-dev__mistralai_Mistral-7B-Instruct-v0.1__1707330518 | ---
dataset_info:
features:
- name: model
dtype: 'null'
- name: category
dtype: string
- name: language
dtype: string
- name: custom_instruction
dtype: bool
- name: id
dtype: string
- name: topic
dtype: string
- name: avatarUrl
dtype: 'null'
- name: idx
dtype: 'null'
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: 'null'
- name: system_prompt
dtype: string
- name: source
dtype: string
- name: model_name
dtype: string
- name: skip_prompt_formatting
dtype: bool
- name: title
dtype: string
- name: hash
dtype: 'null'
- name: views
dtype: 'null'
- name: prompt
dtype: string
- name: token_length
dtype: int64
- name: candidate0
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate0_policy
dtype: string
- name: candidate1_policy
dtype: string
- name: candidate0_score
dtype: float64
- name: candidate1_score
dtype: float64
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen_policy
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
splits:
- name: train_prefs
num_bytes: 2300797
num_examples: 167
download_size: 1365578
dataset_size: 2300797
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
---
|
Mihaiii/OpenHermes-2.5-1k-longest-truncated | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2569241
num_examples: 517
download_size: 1247504
dataset_size: 2569241
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This dataset was created from [HuggingFaceH4/OpenHermes-2.5-1k-longest](https://huggingface.co/datasets/HuggingFaceH4/OpenHermes-2.5-1k-longest).
Its purpose is to be usable in an [axolotl](https://github.com/OpenAccess-AI-Collective/axolotl) config by adding:
```yaml
datasets:
- path: Mihaiii/OpenHermes-2.5-1k-longest-truncated
type: alpaca
```
I eliminated all "glaive-code-assist" rows, plus some others.
See the [OpenHermes-2.5-1k-longest-truncated.ipynb](https://huggingface.co/datasets/Mihaiii/OpenHermes-2.5-1k-longest-truncated/blob/main/OpenHermes-2.5-1k-longest-truncated.ipynb) notebook for details on how the dataset was constructed.
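The row-elimination step can be sketched like this. This is a hypothetical sketch, not the linked notebook: "glaive-code-assist" is the only source named above, and the second source here is purely illustrative.

```python
# Hypothetical sketch of dropping rows by source tag.
# "glaive-code-assist" is the only source named in the card;
# the second source below is purely illustrative.
EXCLUDED_SOURCES = {"glaive-code-assist"}

rows = [
    {"instruction": "Write a sort in C.", "output": "...", "source": "glaive-code-assist"},
    {"instruction": "Explain DNS.", "output": "...", "source": "other"},
]

kept = [r for r in rows if r.get("source") not in EXCLUDED_SOURCES]
print(len(kept))  # 1
```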
|
stevez/test_db | ---
license: mit
---
|
khursani8/ssss | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: speaker_id
dtype: string
splits:
- name: train
num_bytes: 1837335107.872
num_examples: 1207
download_size: 1844050999
dataset_size: 1837335107.872
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bakercok123/MARCIOSEIV1 | ---
license: openrail
---
|
andersonbcdefg/dolly_reward_modeling_pairwise | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response_a
dtype: string
- name: response_b
dtype: string
- name: explanation
dtype: string
- name: preferred
dtype: string
splits:
- name: train
num_bytes: 16503157
num_examples: 19343
download_size: 9011974
dataset_size: 16503157
---
# Dataset Card for "dolly_reward_modeling_pairwise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Milana/russian_keywords | ---
task_categories:
- summarization
language:
- ru
pretty_name: Datasets for evaluation of keyword extraction in Russian
size_categories:
- 10K<n<100K
--- |
yzhuang/autotree_automl_heloc_gosdt_l512_d3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 11682400000
num_examples: 100000
- name: validation
num_bytes: 1168240000
num_examples: 10000
download_size: 1504688602
dataset_size: 12850640000
---
# Dataset Card for "autotree_automl_heloc_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pccl-org/formal-logic-simple-order-multi-token-dynamic-objects-paired-relationship-0-5000 | ---
dataset_info:
features:
- name: greater_than
sequence: int64
- name: less_than
sequence: int64
- name: paired_example
sequence:
sequence:
sequence: int64
- name: correct_example
sequence:
sequence: int64
- name: incorrect_example
sequence:
sequence: int64
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 672730080
num_examples: 2370250
download_size: 229940343
dataset_size: 672730080
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
teletubbee/bees | ---
license: cc
---
|
Lollitor/CID13 | ---
dataset_info:
config_name: Lollitor
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 10852
num_examples: 176
download_size: 3806
dataset_size: 10852
configs:
- config_name: Lollitor
data_files:
- split: train
path: Lollitor/train-*
---
# Dataset Card for "CID13"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amasiukevich/github-issues-datasets | ---
license: apache-2.0
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: 'null'
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 27536364
num_examples: 4000
download_size: 8074913
dataset_size: 27536364
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
carnival13/nci_nq_t5_tokenized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
- split: eval_zero_shot
path: data/eval_zero_shot-*
- split: eval_normal
path: data/eval_normal-*
dataset_info:
features:
- name: input
dtype: string
- name: label
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 137430914
num_examples: 177638
- name: eval
num_bytes: 1529607
num_examples: 7830
- name: eval_zero_shot
num_bytes: 562161
num_examples: 2859
- name: eval_normal
num_bytes: 967446
num_examples: 4971
download_size: 61636686
dataset_size: 140490128
---
# Dataset Card for "nci_nq_t5_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |