datasetId | card |
|---|---|
jlbaker361/league-maybe-gsdf-counterfeit-50 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: seed
dtype: int64
- name: steps
dtype: int64
splits:
- name: train
num_bytes: 28604472.0
num_examples: 72
download_size: 28601869
dataset_size: 28604472.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jjonhwa/V4 | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 1664924673
num_examples: 542138
download_size: 194102886
dataset_size: 1664924673
---
# Dataset Card for "V4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ninsean1/twitter-mental-illness-detection | ---
task_categories:
- text-classification
language:
- en
tags:
- mental
- mental-health
- bert
- roberta
- twitter
- twitter-mental-health
- mental-illness
pretty_name: B
size_categories:
- 10K<n<100K
--- |
arubenruben/portuguese-mapa | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PESSOA
'2': I-PESSOA
'3': B-ORGANIZACAO
'4': I-ORGANIZACAO
'5': B-LOCAL
'6': I-LOCAL
'7': B-TEMPO
'8': I-TEMPO
'9': B-VALOR
'10': I-VALOR
splits:
- name: train
num_bytes: 970478
num_examples: 1086
- name: validation
num_bytes: 119282
num_examples: 105
- name: test
num_bytes: 335581
num_examples: 390
download_size: 218401
dataset_size: 1425341
---
# Dataset Card for "portuguese-mapa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/OK-VQA_test_google_flan_t5_xxl_mode_T_A_CM_D_PNP_GENERIC_Q_rices_ns_5046 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__text
num_bytes: 58812363
num_examples: 5046
download_size: 10592031
dataset_size: 58812363
---
# Dataset Card for "OK-VQA_test_google_flan_t5_xxl_mode_T_A_CM_D_PNP_GENERIC_Q_rices_ns_5046"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gabriel/quora_swe | ---
language:
- sv
license:
- mit
size_categories:
- 10K<n<100K
task_categories:
- text-retrieval
- text-classification
task_ids:
- semantic-similarity-classification
tags:
- question-pairing
- semantic-search
---
# Dataset Card for "quora_swe"
The quora_swe dataset is a subset of quora-deduplicates, a Swedish semantic textual similarity dataset produced by machine translation (MT).
|
xunman2/illuminationdb | ---
license: gpl
---
|
TinyPixel/claude | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 17853037
num_examples: 1609
download_size: 9535294
dataset_size: 17853037
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
simonguest/sprites | ---
license: apache-2.0
---
|
kbthebest181/testfinetune | ---
dataset_info:
features:
- name: example
dtype: string
splits:
- name: train
num_bytes: 62396
num_examples: 84
- name: test
num_bytes: 6772
num_examples: 9
download_size: 18126
dataset_size: 69168
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
test |
wics/ceval | ---
license: unknown
---
|
jacobbieker/gfs-kerchunk | ---
license: mit
---
|
HumanDynamics/sft_dataset | ---
dataset_info:
features:
- name: system
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 64750059.66456871
num_examples: 30000
download_size: 30877974
dataset_size: 64750059.66456871
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sft_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abzzer/security-code-chatbot_pre-train | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4159264
num_examples: 1000
download_size: 2254346
dataset_size: 4159264
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
saibo/bookcorpus_compact_512_shard6_of_10 | ---
dataset_info:
features:
- name: text
dtype: string
- name: concept_with_offset
dtype: string
splits:
- name: train
num_bytes: 804636647
num_examples: 121933
download_size: 401996995
dataset_size: 804636647
---
# Dataset Card for "bookcorpus_compact_512_shard6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/scheherazade_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of scheherazade/シェヘラザード/山鲁佐德 (Fate/Grand Order)
This is the dataset of scheherazade/シェヘラザード/山鲁佐德 (Fate/Grand Order), containing 430 images and their tags.
The core tags of this character are `long_hair, dark_skin, dark-skinned_female, breasts, black_hair, green_eyes, large_breasts, very_long_hair, hat, parted_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 430 | 621.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scheherazade_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 430 | 539.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scheherazade_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1007 | 969.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scheherazade_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/scheherazade_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, arm_wrap, circlet, forehead_jewel, pauldrons, solo, thighs, bandaged_arm, bracelet, bridal_gauntlets, feathers, looking_at_viewer, thumb_ring, cleavage, sitting, covered_navel, facial_mark, armlet, pelvic_curtain, scroll, parted_lips, simple_background, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, arm_wrap, bandaged_arm, blue_armor, bracelet, bridal_gauntlets, circlet, cleavage, covered_navel, forehead_jewel, parted_lips, pauldrons, solo, thighs, thumb_ring, feathers, looking_at_viewer, pelvic_curtain, scroll, seiza, staff |
| 2 | 5 |  |  |  |  |  | 1boy, 1girl, circlet, hetero, looking_at_viewer, penis, bar_censor, forehead_jewel, male_pubic_hair, solo_focus, ass, cum, fellatio, nude, pov, simple_background, white_background, :>=, blush, heart-shaped_pupils, jewelry, mouth_veil, pussy, sex, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | arm_wrap | circlet | forehead_jewel | pauldrons | solo | thighs | bandaged_arm | bracelet | bridal_gauntlets | feathers | looking_at_viewer | thumb_ring | cleavage | sitting | covered_navel | facial_mark | armlet | pelvic_curtain | scroll | parted_lips | simple_background | white_background | blue_armor | seiza | staff | 1boy | hetero | penis | bar_censor | male_pubic_hair | solo_focus | ass | cum | fellatio | nude | pov | :>= | blush | heart-shaped_pupils | jewelry | mouth_veil | pussy | sex |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:----------|:-----------------|:------------|:-------|:---------|:---------------|:-----------|:-------------------|:-----------|:--------------------|:-------------|:-----------|:----------|:----------------|:--------------|:---------|:-----------------|:---------|:--------------|:--------------------|:-------------------|:-------------|:--------|:--------|:-------|:---------|:--------|:-------------|:------------------|:-------------|:------|:------|:-----------|:-------|:------|:------|:--------|:----------------------|:----------|:-------------|:--------|:------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | | | X | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | | | X | | | | | X | | | | | | | | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
anishka/UD_Treebank_Te_Transliterate | ---
license: apache-2.0
---
|
joey234/mmlu-professional_accounting-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 201532
num_examples: 282
download_size: 111563
dataset_size: 201532
---
# Dataset Card for "mmlu-professional_accounting-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/folinic_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of folinic/フォリニック/亚叶 (Arknights)
This is the dataset of folinic/フォリニック/亚叶 (Arknights), containing 69 images and their tags.
The core tags of this character are `brown_hair, long_hair, animal_ears, yellow_eyes, hair_ornament, hair_flower, breasts, multicolored_hair, cat_ears, blonde_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 69 | 104.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/folinic_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 69 | 89.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/folinic_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 162 | 171.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/folinic_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/folinic_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, detached_sleeves, solo, white_flower, black_thighhighs, ponytail, toes, bare_shoulders, blush, full_body, hairclip, long_sleeves, no_shoes, official_alternate_costume, simple_background, soles, stirrup_legwear, black_shorts, cleavage, closed_mouth, infection_monitor_(arknights), looking_at_viewer, sitting, toenails, torn_thighhighs, black_gloves, collarbone, foot_focus, legs, navel, white_shirt |
| 1 | 36 |  |  |  |  |  | white_shirt, 1girl, long_sleeves, solo, looking_at_viewer, white_flower, white_jacket, black_ascot, blue_gloves, collared_shirt, simple_background, blush, smile, upper_body, bare_shoulders, black_choker, white_background, closed_mouth, hair_between_eyes, off_shoulder, open_jacket |
| 2 | 7 |  |  |  |  |  | 1girl, completely_nude, large_breasts, nipples, blush, collarbone, navel, solo_focus, looking_at_viewer, streaked_hair, sweat, two-tone_hair, white_flower, 2girls, open_mouth, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | detached_sleeves | solo | white_flower | black_thighhighs | ponytail | toes | bare_shoulders | blush | full_body | hairclip | long_sleeves | no_shoes | official_alternate_costume | simple_background | soles | stirrup_legwear | black_shorts | cleavage | closed_mouth | infection_monitor_(arknights) | looking_at_viewer | sitting | toenails | torn_thighhighs | black_gloves | collarbone | foot_focus | legs | navel | white_shirt | white_jacket | black_ascot | blue_gloves | collared_shirt | smile | upper_body | black_choker | white_background | hair_between_eyes | off_shoulder | open_jacket | completely_nude | large_breasts | nipples | solo_focus | streaked_hair | sweat | two-tone_hair | 2girls | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:-------|:---------------|:-------------------|:-----------|:-------|:-----------------|:--------|:------------|:-----------|:---------------|:-----------|:-----------------------------|:--------------------|:--------|:------------------|:---------------|:-----------|:---------------|:--------------------------------|:--------------------|:----------|:-----------|:------------------|:---------------|:-------------|:-------------|:-------|:--------|:--------------|:---------------|:--------------|:--------------|:-----------------|:--------|:-------------|:---------------|:-------------------|:--------------------|:---------------|:--------------|:------------------|:----------------|:----------|:-------------|:----------------|:--------|:----------------|:---------|:-------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 36 |  |  |  |  |  | X | | X | X | | | | X | X | | | X | | | X | | | | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | X | | | | | X | | | | | | | | | | | | | X | X | | | | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
albertvillanova/tmp-mention | ---
license: cc-by-4.0
tags:
- zenodo
---
# Dataset Card for MultiLingual LibriSpeech
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [MultiLingual LibriSpeech ASR corpus](http://www.openslr.org/94)
- **Repository:** [Needs More Information]
- **Paper:** [MLS: A Large-Scale Multilingual Dataset for Speech Research](https://arxiv.org/abs/2012.03411)
- **Leaderboard:** [Paperswithcode Leaderboard](https://paperswithcode.com/dataset/multilingual-librispeech)
### Dataset Summary
<div class="course-tip course-tip-orange bg-gradient-to-br dark:bg-gradient-to-r before:border-orange-500 dark:before:border-orange-800 from-orange-50 dark:from-gray-900 to-white dark:to-gray-950 border border-orange-50 text-orange-700 dark:text-gray-400"><p><b>Deprecated:</b> Not every model supports a fast tokenizer. Take a look at this <a href="index#supported-frameworks">table</a> to check if a model has fast tokenizer support.</p></div>
The Multilingual LibriSpeech (MLS) dataset is a large multilingual corpus suitable for speech research. It is derived from read audiobooks from LibriVox and covers 8 languages: English, German, Dutch, Spanish, French, Italian, Portuguese, and Polish.
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`, `audio-speaker-identification`: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER). The task has an active leaderboard which can be found at https://paperswithcode.com/dataset/multilingual-librispeech and ranks models based on their WER.
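The word error rate (WER) mentioned above is word-level edit distance divided by reference length; a minimal sketch in plain Python (the sentences are made up for illustration, not taken from the corpus):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # DP table: d[i][j] = edits needed to turn ref[:i] into hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])  # substitution (or match)
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("the cat sat on the mat", "the cat sat on mat"))  # one deletion over 6 words
```

In practice a library such as `jiwer` or the `evaluate` package is typically used instead of hand-rolling this.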
<div class="alert alert-danger d-flex align-items-center" role="alert">
<svg class="bi flex-shrink-0 me-2" width="24" height="24" role="img" aria-label="Danger:"><use xlink:href="#exclamation-triangle-fill"/></svg>
<div>
An example danger alert with an icon
</div>
</div>
<div class="alert alert-block alert-warning"> ⚠ In general, just avoid the red boxes. </div>
<div class="alert alert-block alert-danger"> In general, just avoid the red boxes. </div>
<div class="alert alert-danger" role="alert"> In general, just avoid the red boxes. </div>
<div class="alert" role="alert"> In general, just avoid the red boxes. </div>
<div class="course-tip-orange">
<strong>Error:</strong>
</div>
<div class="alert alert-danger" role="alert">
<div class="row vertical-align">
<div class="col-xs-1 text-center">
<i class="fa fa-exclamation-triangle fa-2x"></i>
</div>
<div class="col-xs-11">
<strong>Error:</strong>
</div>
</div>
</div>
>[!WARNING]
>This is a warning
_**Warning:** Be very careful here._
<Deprecated>
This is a warning
</Deprecated>
<Tip warning>
This is a warning
</Tip>
<Tip warning={true}>
This is a warning
</Tip>
> **Warning**
> This is a warning |
sammyfroly/ladyoscar2 | ---
license: openrail
---
|
folkopinion/bert-political-statements-and-questions-swedish-ner | ---
task_categories:
- token-classification
---
# AutoTrain Dataset for project: bert-political-statements-and-questions-swedish-ner
## Dataset Description
This dataset has been automatically processed by AutoTrain for project bert-political-statements-and-questions-swedish-ner.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"tokens": [
"KD",
"ska",
"f\u00f6rhandla",
"v\u00e5rbudget",
"med",
"Milj\u00f6partiet"
],
"tags": [
1,
6,
6,
6,
6,
1
]
},
{
"tokens": [
"V\u00e4nsterpartiet",
"ska",
"diskutera",
"h\u00f6stbudget",
"med",
"Sverigedemokraterna"
],
"tags": [
1,
6,
6,
6,
6,
1
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"tokens": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"tags": "Sequence(feature=ClassLabel(names=['B-LOC', 'B-ORG', 'B-PER', 'I-LOC', 'I-ORG', 'I-PER', 'UNK'], id=None), length=-1, id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 3147 |
| valid | 821 |
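The integer `tags` in the sample above index into the `ClassLabel` names listed under Dataset Fields; a minimal decoding sketch using that mapping:

```python
# Label names as listed in the "Dataset Fields" section of this card
NAMES = ["B-LOC", "B-ORG", "B-PER", "I-LOC", "I-ORG", "I-PER", "UNK"]

def decode(tags):
    """Map integer tag ids to their ClassLabel names."""
    return [NAMES[t] for t in tags]

tokens = ["KD", "ska", "förhandla", "vårbudget", "med", "Miljöpartiet"]
print(list(zip(tokens, decode([1, 6, 6, 6, 6, 1]))))
# the party names 'KD' and 'Miljöpartiet' decode to B-ORG; the rest to UNK
```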
|
open-llm-leaderboard/details_NeuralNovel__Mini-Mixtral-v0.2 | ---
pretty_name: Evaluation run of NeuralNovel/Mini-Mixtral-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NeuralNovel/Mini-Mixtral-v0.2](https://huggingface.co/NeuralNovel/Mini-Mixtral-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Mini-Mixtral-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T15:52:44.187947](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Mini-Mixtral-v0.2/blob/main/results_2024-03-24T15-52-44.187947.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6369221512047082,\n\
\ \"acc_stderr\": 0.03244399463255061,\n \"acc_norm\": 0.6412715097047738,\n\
\ \"acc_norm_stderr\": 0.03309460958147091,\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150535,\n \"mc2\": 0.5036355079944391,\n\
\ \"mc2_stderr\": 0.014726082878656196\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.014413988396996077,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.014235872487909869\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.644991037641904,\n\
\ \"acc_stderr\": 0.004775380866948015,\n \"acc_norm\": 0.841167098187612,\n\
\ \"acc_norm_stderr\": 0.0036477317239388294\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n\
\ \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.7322580645161291,\n\
\ \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.034991131376767445,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n\
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547307,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547307\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200144,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200144\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565438,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677003,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677003\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.394413407821229,\n\
\ \"acc_stderr\": 0.01634538676210397,\n \"acc_norm\": 0.394413407821229,\n\
\ \"acc_norm_stderr\": 0.01634538676210397\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.02597656601086274,\n\
\ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.02597656601086274\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n\
\ \"acc_stderr\": 0.012715404841277736,\n \"acc_norm\": 0.45371577574967403,\n\
\ \"acc_norm_stderr\": 0.012715404841277736\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.0279715413701706,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.0279715413701706\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150535,\n \"mc2\": 0.5036355079944391,\n\
\ \"mc2_stderr\": 0.014726082878656196\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.01147774768422318\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45564821834723274,\n \
\ \"acc_stderr\": 0.013718194542485594\n }\n}\n```"
repo_url: https://huggingface.co/NeuralNovel/Mini-Mixtral-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|arc:challenge|25_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|gsm8k|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hellaswag|10_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-52-44.187947.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T15-52-44.187947.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- '**/details_harness|winogrande|5_2024-03-24T15-52-44.187947.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T15-52-44.187947.parquet'
- config_name: results
data_files:
- split: 2024_03_24T15_52_44.187947
path:
- results_2024-03-24T15-52-44.187947.parquet
- split: latest
path:
- results_2024-03-24T15-52-44.187947.parquet
---
# Dataset Card for Evaluation run of NeuralNovel/Mini-Mixtral-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeuralNovel/Mini-Mixtral-v0.2](https://huggingface.co/NeuralNovel/Mini-Mixtral-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
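As the split names above illustrate, each timestamped split name is derived mechanically from the run timestamp: `-` and `:` are replaced with `_`. A minimal helper sketch (the mapping is inferred from the names in this card, not an official API):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name.

    Inferred from the split names in this card: '-' and ':' become '_',
    e.g. '2024-03-24T15:52:44.187947' -> '2024_03_24T15_52_44.187947'.
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2024-03-24T15:52:44.187947"))
# → 2024_03_24T15_52_44.187947
```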
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Mini-Mixtral-v0.2",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-24T15:52:44.187947](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Mini-Mixtral-v0.2/blob/main/results_2024-03-24T15-52-44.187947.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6369221512047082,
"acc_stderr": 0.03244399463255061,
"acc_norm": 0.6412715097047738,
"acc_norm_stderr": 0.03309460958147091,
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150535,
"mc2": 0.5036355079944391,
"mc2_stderr": 0.014726082878656196
},
"harness|arc:challenge|25": {
"acc": 0.5819112627986348,
"acc_stderr": 0.014413988396996077,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.014235872487909869
},
"harness|hellaswag|10": {
"acc": 0.644991037641904,
"acc_stderr": 0.004775380866948015,
"acc_norm": 0.841167098187612,
"acc_norm_stderr": 0.0036477317239388294
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547307,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547307
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200144,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200144
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565438,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.394413407821229,
"acc_stderr": 0.01634538676210397,
"acc_norm": 0.394413407821229,
"acc_norm_stderr": 0.01634538676210397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.02597656601086274,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.02597656601086274
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.012715404841277736,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.012715404841277736
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.0279715413701706,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.0279715413701706
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150535,
"mc2": 0.5036355079944391,
"mc2_stderr": 0.014726082878656196
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.01147774768422318
},
"harness|gsm8k|5": {
"acc": 0.45564821834723274,
"acc_stderr": 0.013718194542485594
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
shiqi0715/ANewTest | ---
license: openrail
---
|
fxmeng/general_policy | ---
configs:
- config_name: default
data_files:
- split: train_mc
path: data/train_mc-*
- split: test_mc
path: data/test_mc-*
- split: train_open
path: data/train_open-*
- split: test_open
path: data/test_open-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: board_svg
dtype: string
splits:
- name: train_mc
num_bytes: 111734126
num_examples: 3760
- name: test_mc
num_bytes: 2964558
num_examples: 100
- name: train_open
num_bytes: 116456269
num_examples: 3844
- name: test_open
num_bytes: 3023141
num_examples: 100
download_size: 32711459
dataset_size: 234178094
---
# Dataset Card for "general_policy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kakshak/optimoz | ---
license: mit
---
|
tyzhu/find_last_sent_train_30_eval_10 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 89198
num_examples: 70
- name: validation
num_bytes: 10769
num_examples: 10
download_size: 64403
dataset_size: 99967
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_last_sent_train_30_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chhuuchuuz/YOOYEON | ---
license: openrail
---
|
argilla/distilabel-evol-prompt-collective | ---
dataset_info:
features:
- name: source
dtype: string
- name: kind
dtype: string
- name: evolved_from
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 3330922
num_examples: 2473
download_size: 1817785
dataset_size: 3330922
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- synthetic
- distilabel
---
|
duwuonline/en_vi_advanced_sentences | ---
license: other
language:
- vi
- en
task_categories:
- translation
---
## Model description
I crawled this data from these sites: https://prep.vn/blog/idiom-theo-chu-de-trong-tieng-anh/ and https://www.enewsdispatch.com/
I translated the idiom site carefully; however, for the enews site I used Google Translate.
|
iansousa12/silveron | ---
license: mit
---
|
CyberHarem/tanaka_mamimi_theidolmstershinycolors | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tanaka_mamimi/田中摩美々/타나카마미미 (THE iDOLM@STER: SHINY COLORS)
This is the dataset of tanaka_mamimi/田中摩美々/타나카마미미 (THE iDOLM@STER: SHINY COLORS), containing 500 images and their tags.
The core tags of this character are `purple_hair, bangs, purple_eyes, diagonal_bangs, twintails, breasts, earrings, short_twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 801.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanaka_mamimi_theidolmstershinycolors/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 415.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanaka_mamimi_theidolmstershinycolors/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1203 | 898.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanaka_mamimi_theidolmstershinycolors/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 686.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanaka_mamimi_theidolmstershinycolors/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1203 | 1.36 GiB | [Download](https://huggingface.co/datasets/CyberHarem/tanaka_mamimi_theidolmstershinycolors/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tanaka_mamimi_theidolmstershinycolors',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, long_hair, neck_ribbon, school_uniform, solo, blazer, red_ribbon, white_shirt, blush, long_sleeves, looking_at_viewer, black_jacket, collared_shirt, pleated_skirt, open_jacket, smile, black_skirt, book, dress_shirt, hair_bow, parted_lips |
| 1 | 8 |  |  |  |  |  | 1girl, garter_straps, looking_at_viewer, plaid_skirt, pleated_skirt, school_uniform, single_thighhigh, solo, thigh_strap, white_shirt, green_jacket, blush, nail_polish, off_shoulder, simple_background, white_background, jewelry, long_sleeves, plaid_bowtie, purple_nails, black_choker, open_jacket, open_mouth, sleeves_past_wrists, spiked_choker |
| 2 | 6 |  |  |  |  |  | 1girl, collarbone, ear_piercing, solo, bare_shoulders, black_choker, cleavage, looking_at_viewer, nail_polish, off_shoulder, black_nails, blush, long_sleeves, makeup, necklace, blunt_bangs, green_jacket, open_jacket, purple_lips, simple_background, upper_body |
| 3 | 5 |  |  |  |  |  | black_gloves, cleavage, elbow_gloves, jewelry, long_hair, looking_at_viewer, 1girl, chain, medium_breasts, midriff, navel, skirt, solo, choker, thighhighs, demon_horns, facial_mark, feathers, sitting, smile |
| 4 | 28 |  |  |  |  |  | 1boy, 1girl, hetero, penis, solo_focus, blush, jewelry, mosaic_censoring, erection, tongue_out, cum, fellatio, looking_at_viewer, nail_polish, choker, piercing, male_pubic_hair, pov, purple_nails, sweat |
| 5 | 6 |  |  |  |  |  | 1girl, hat, long_hair, looking_at_viewer, choker, grin, midriff, navel, solo, black_headwear, blush, crop_top, bare_shoulders, ear_piercing, nail_polish, off_shoulder, purple_jacket, ring, skirt |
| 6 | 5 |  |  |  |  |  | 1girl, hairclip, long_sleeves, looking_at_viewer, smile, solo, closed_mouth, floral_print, jewelry, medium_breasts, nail_polish, black_skirt, blunt_bangs, flower, green_shirt, shawl, sitting, sweater, black_belt, blush, holding, purple_pantyhose, simple_background, turtleneck, white_background |
| 7 | 16 |  |  |  |  |  | 1girl, solo, looking_at_viewer, jewelry, upper_body, kimono, black_gloves, dress, hair_bow, bare_shoulders, fur_trim, smile |
| 8 | 8 |  |  |  |  |  | black_gloves, choker, 1girl, bare_shoulders, dress, goggles_on_head, looking_at_viewer, solo, bow, detached_sleeves, rose, corset, gears, open_mouth, shorts, asymmetrical_legwear, criss-cross_halter, frills, holding, steampunk, boots, microphone, short_sleeves, simple_background, single_thighhigh, smile, white_background |
| 9 | 7 |  |  |  |  |  | bare_shoulders, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, 1girl, black_leotard, blush, medium_breasts, wrist_cuffs, cleavage, detached_collar, pantyhose, solo, blunt_bangs, rabbit_tail, strapless_leotard, black_nails, bottle, bowtie, hair_ornament, long_hair, nail_polish |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_hair | neck_ribbon | school_uniform | solo | blazer | red_ribbon | white_shirt | blush | long_sleeves | looking_at_viewer | black_jacket | collared_shirt | pleated_skirt | open_jacket | smile | black_skirt | book | dress_shirt | hair_bow | parted_lips | garter_straps | plaid_skirt | single_thighhigh | thigh_strap | green_jacket | nail_polish | off_shoulder | simple_background | white_background | jewelry | plaid_bowtie | purple_nails | black_choker | open_mouth | sleeves_past_wrists | spiked_choker | collarbone | ear_piercing | bare_shoulders | cleavage | black_nails | makeup | necklace | blunt_bangs | purple_lips | upper_body | black_gloves | elbow_gloves | chain | medium_breasts | midriff | navel | skirt | choker | thighhighs | demon_horns | facial_mark | feathers | sitting | 1boy | hetero | penis | solo_focus | mosaic_censoring | erection | tongue_out | cum | fellatio | piercing | male_pubic_hair | pov | sweat | hat | grin | black_headwear | crop_top | purple_jacket | ring | hairclip | closed_mouth | floral_print | flower | green_shirt | shawl | sweater | black_belt | holding | purple_pantyhose | turtleneck | kimono | dress | fur_trim | goggles_on_head | bow | detached_sleeves | rose | corset | gears | shorts | asymmetrical_legwear | criss-cross_halter | frills | steampunk | boots | microphone | short_sleeves | fake_animal_ears | playboy_bunny | rabbit_ears | black_leotard | wrist_cuffs | detached_collar | pantyhose | rabbit_tail | strapless_leotard | bottle | bowtie | hair_ornament |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:--------------|:-----------------|:-------|:---------|:-------------|:--------------|:--------|:---------------|:--------------------|:---------------|:-----------------|:----------------|:--------------|:--------|:--------------|:-------|:--------------|:-----------|:--------------|:----------------|:--------------|:-------------------|:--------------|:---------------|:--------------|:---------------|:--------------------|:-------------------|:----------|:---------------|:---------------|:---------------|:-------------|:----------------------|:----------------|:-------------|:---------------|:-----------------|:-----------|:--------------|:---------|:-----------|:--------------|:--------------|:-------------|:---------------|:---------------|:--------|:-----------------|:----------|:--------|:--------|:---------|:-------------|:--------------|:--------------|:-----------|:----------|:-------|:---------|:--------|:-------------|:-------------------|:-----------|:-------------|:------|:-----------|:-----------|:------------------|:------|:--------|:------|:-------|:-----------------|:-----------|:----------------|:-------|:-----------|:---------------|:---------------|:---------|:--------------|:--------|:----------|:-------------|:----------|:-------------------|:-------------|:---------|:--------|:-----------|:------------------|:------|:-------------------|:-------|:---------|:--------|:---------|:-----------------------|:---------------------|:---------|:------------|:--------|:-------------|:----------------|:-------------------|:----------------|:--------------|:----------------|:--------------|:------------------|:------------|:--------------|:--------------------|:---------|:---------|:----------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | | X | X | | | X | X | X | X | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | | X | | | | X | X | X | | | | X | | | | | | | | | | | X | X | X | X | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | | X | | | | | | X | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 28 |  |  |  |  |  | X | | | | | | | | X | | X | | | | | | | | | | | | | | | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | | | X | | | | X | | X | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | X | X | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | X | | | | X | X | X | | | | | X | X | | | | | | | | | | X | | X | X | X | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 16 |  |  |  |  |  | X | | | | X | | | | | | X | | | | | X | | | | X | | | | | | | | | | | X | | | | | | | | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 8 |  |  |  |  |  | X | | | | X | | | | | | X | | | | | X | | | | | | | | X | | | | | X | X | | | | | X | | | | | X | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | X | | | X | | | | X | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
huggingartists/melanie-martinez | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/melanie-martinez"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.46438 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/917de5970c2afbbf03a7705f18eb6951.811x811x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/melanie-martinez">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Melanie Martinez</div>
<a href="https://genius.com/artists/melanie-martinez">
<div style="text-align: center; font-size: 14px;">@melanie-martinez</div>
</a>
</div>
### Dataset Summary
This is a lyrics dataset parsed from Genius, designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/melanie-martinez).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
Load this dataset directly with the `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/melanie-martinez")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|329| -| -|
The 'train' split can easily be divided into 'train', 'validation', and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/melanie-martinez")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03  # implied: whatever remains after train and validation

# np.split expects cut indices, not sizes: everything before the first index
# becomes train, everything between the two indices becomes validation,
# and the remainder becomes test.
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)}),
    }
)
```
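As a sanity check, the cut points passed to `np.split` can be verified on a synthetic list of the same length as this card's train split (329 examples); the expected sizes follow directly from the 90/7/3 percentages. The `texts` list below is a hypothetical stand-in for the real lyric strings:

```python
import numpy as np

# Hypothetical stand-in for the 329 lyric strings in the 'train' split.
texts = [f"song {i}" for i in range(329)]

train_percentage = 0.9
validation_percentage = 0.07

# np.split takes cut indices: [0, cut1) -> train, [cut1, cut2) -> validation,
# [cut2, end) -> test.
cut1 = int(len(texts) * train_percentage)
cut2 = int(len(texts) * (train_percentage + validation_percentage))
train, validation, test = np.split(np.array(texts), [cut1, cut2])

print(len(train), len(validation), len(test))  # 296 23 10
```

Note that truncation via `int()` rounds each cut point down, so the test split absorbs any rounding remainder; the three parts always sum to the original length.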
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
  author = {Aleksey Korshuk},
  year   = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
joey234/mmlu-world_religions-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 1878
num_examples: 5
download_size: 5710
dataset_size: 1878
---
# Dataset Card for "mmlu-world_religions-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/BGL_DistilRoBERTa_Finetuned | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115582709.0625
num_examples: 37500
- name: test
num_bytes: 38527570.0
num_examples: 12500
download_size: 211882718
dataset_size: 154110279.0625
---
# Dataset Card for "BGL_DistilRoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Intuit-GenSRF/jigsaw-toxic-comment | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: string
splits:
- name: train
num_bytes: 64586545
num_examples: 159571
download_size: 41105413
dataset_size: 64586545
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "jigsaw-toxic-comment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
visual-layer/vl-laion-1b | ---
license: other
---
|
vvuri/openassistant-guanaco-ru | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1957697
num_examples: 709
- name: test
num_bytes: 105639
num_examples: 39
download_size: 999023
dataset_size: 2063336
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
PaulAdversarial/all_news_finance_sm_1h2023 | ---
license: afl-3.0
---
|
ozoromo/DiskStrukt2023-VL | ---
tags:
- code
- lecture
- math
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Embeddings generated from transcripts of the Diskrete Strukturen SOSE2023 lectures, usable with OpenAI's ChatGPT and likely other tools.
## Dataset Structure
The dataset is stored in the form of a Chroma DB. |
jimmypjoy/test_dataset1 | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 13792576
num_examples: 17262
- name: validation
num_bytes: 1870389
num_examples: 2158
- name: test
num_bytes: 1379190
num_examples: 2158
download_size: 10073414
dataset_size: 17042155
---
# Dataset Card for "test_dataset1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random_letter_find_passage_train10_eval40_num | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 4674
num_examples: 60
- name: validation
num_bytes: 4480
num_examples: 40
download_size: 8542
dataset_size: 9154
---
# Dataset Card for "random_letter_find_passage_train10_eval40_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Chinese-English_Parallel_Corpus_Data | ---
task_categories:
- translation
language:
- zh
- en
---
# Dataset Card for Nexdata/Chinese-English_Parallel_Corpus_Data
## Description
3,060,000 pairs of parallel Chinese-English translation corpus, stored in TXT files. It covers domains such as travel, medicine, daily life, and TV drama. Data cleaning, desensitization, and quality inspection have been carried out. The corpus can serve as a basic text corpus and as training data for machine translation.
For more details, please refer to the link: https://www.nexdata.ai/datasets/147?source=Huggingface
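Since the corpus is delivered as plain TXT files, a small loader is often the first step before feeding the pairs into a translation pipeline. The sketch below assumes one tab-separated Chinese-English pair per line; the actual Nexdata delivery layout may differ, so adjust the parsing accordingly.

```python
from pathlib import Path


def load_parallel_pairs(path):
    """Return a list of (chinese, english) tuples from a tab-separated TXT file.

    Lines that do not contain exactly one tab separator are skipped,
    which also drops empty or malformed lines.
    """
    pairs = []
    for line in Path(path).read_text(encoding="utf-8").splitlines():
        parts = line.split("\t", 1)
        if len(parts) == 2 and parts[0].strip() and parts[1].strip():
            pairs.append((parts[0].strip(), parts[1].strip()))
    return pairs


# Tiny usage example with a synthetic two-line file:
sample = Path("sample_corpus.txt")
sample.write_text("你好世界\tHello world\n谢谢\tThank you\n", encoding="utf-8")
pairs = load_parallel_pairs(sample)
```

Reading the whole file into memory is fine for a quick look; for the full 3.06-million-pair corpus, a streaming variant that yields pairs line by line would be preferable.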
# Specifications
## Storage format
TXT
## Data content
Chinese-English Parallel Corpus Data
## Data size
3.06 million pairs of Chinese-English parallel corpus data. The Chinese sentences contain 4-25 characters.
## Language
Chinese, English
## Application scenario
machine translation
# Licensing Information
Commercial License |
open-llm-leaderboard/details_abhinand__TinyLlama-1.1B-OpenHermes-2.5-Chat-v0.1-sft | ---
pretty_name: Evaluation run of abhinand/TinyLlama-1.1B-OpenHermes-2.5-Chat-v0.1-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhinand/TinyLlama-1.1B-OpenHermes-2.5-Chat-v0.1-sft](https://huggingface.co/abhinand/TinyLlama-1.1B-OpenHermes-2.5-Chat-v0.1-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhinand__TinyLlama-1.1B-OpenHermes-2.5-Chat-v0.1-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T17:14:23.024715](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__TinyLlama-1.1B-OpenHermes-2.5-Chat-v0.1-sft/blob/main/results_2024-02-09T17-14-23.024715.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25230068016115625,\n\
\ \"acc_stderr\": 0.030498670802431283,\n \"acc_norm\": 0.25259575273482276,\n\
\ \"acc_norm_stderr\": 0.03119964119680332,\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757442,\n \"mc2\": 0.3621952768373166,\n\
\ \"mc2_stderr\": 0.013699293770021182\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.30802047781569963,\n \"acc_stderr\": 0.01349142951729204,\n\
\ \"acc_norm\": 0.3378839590443686,\n \"acc_norm_stderr\": 0.01382204792228351\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4411471818362876,\n\
\ \"acc_stderr\": 0.004955095096264714,\n \"acc_norm\": 0.5872336188010356,\n\
\ \"acc_norm_stderr\": 0.004913253031155673\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073465,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073465\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\"\
: 0.16,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1676300578034682,\n\
\ \"acc_stderr\": 0.028481963032143377,\n \"acc_norm\": 0.1676300578034682,\n\
\ \"acc_norm_stderr\": 0.028481963032143377\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162452,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162452\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577657,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.20967741935483872,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.20967741935483872,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.19704433497536947,\n \"acc_stderr\": 0.02798672466673621,\n\
\ \"acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.02798672466673621\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.02977866303775295,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.02977866303775295\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.021606294494647727,\n\
\ \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.021606294494647727\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882378,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567977,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567977\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24587155963302754,\n \"acc_stderr\": 0.018461940968708446,\n \"\
acc_norm\": 0.24587155963302754,\n \"acc_norm_stderr\": 0.018461940968708446\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3194444444444444,\n \"acc_stderr\": 0.03179876342176851,\n \"\
acc_norm\": 0.3194444444444444,\n \"acc_norm_stderr\": 0.03179876342176851\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.28270042194092826,\n \"acc_stderr\": 0.02931281415395592,\n \
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.02931281415395592\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.35874439461883406,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.14563106796116504,\n \"acc_stderr\": 0.0349260647662379,\n\
\ \"acc_norm\": 0.14563106796116504,\n \"acc_norm_stderr\": 0.0349260647662379\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n\
\ \"acc_stderr\": 0.029202540153431173,\n \"acc_norm\": 0.27350427350427353,\n\
\ \"acc_norm_stderr\": 0.029202540153431173\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n\
\ \"acc_stderr\": 0.01605079214803654,\n \"acc_norm\": 0.2796934865900383,\n\
\ \"acc_norm_stderr\": 0.01605079214803654\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545536,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545536\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3022508038585209,\n\
\ \"acc_stderr\": 0.02608270069539965,\n \"acc_norm\": 0.3022508038585209,\n\
\ \"acc_norm_stderr\": 0.02608270069539965\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22340425531914893,\n \"acc_stderr\": 0.02484792135806396,\n \
\ \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.02484792135806396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n\
\ \"acc_stderr\": 0.011025499291443738,\n \"acc_norm\": 0.24771838331160365,\n\
\ \"acc_norm_stderr\": 0.011025499291443738\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22426470588235295,\n \"acc_stderr\": 0.025336848563332338,\n\
\ \"acc_norm\": 0.22426470588235295,\n \"acc_norm_stderr\": 0.025336848563332338\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528034,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528034\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878284,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878284\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1673469387755102,\n \"acc_stderr\": 0.023897144768914524,\n\
\ \"acc_norm\": 0.1673469387755102,\n \"acc_norm_stderr\": 0.023897144768914524\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757442,\n \"mc2\": 0.3621952768373166,\n\
\ \"mc2_stderr\": 0.013699293770021182\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6093133385951065,\n \"acc_stderr\": 0.013712536036556647\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.053828658074298714,\n \
\ \"acc_stderr\": 0.00621632864023813\n }\n}\n```"
repo_url: https://huggingface.co/abhinand/TinyLlama-1.1B-OpenHermes-2.5-Chat-v0.1-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|arc:challenge|25_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|gsm8k|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hellaswag|10_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T17-14-23.024715.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T17-14-23.024715.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- '**/details_harness|winogrande|5_2024-02-09T17-14-23.024715.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T17-14-23.024715.parquet'
- config_name: results
data_files:
- split: 2024_02_09T17_14_23.024715
path:
- results_2024-02-09T17-14-23.024715.parquet
- split: latest
path:
- results_2024-02-09T17-14-23.024715.parquet
---
# Dataset Card for Evaluation run of abhinand/TinyLlama-1.1B-OpenHermes-2.5-Chat-v0.1-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhinand/TinyLlama-1.1B-OpenHermes-2.5-Chat-v0.1-sft](https://huggingface.co/abhinand/TinyLlama-1.1B-OpenHermes-2.5-Chat-v0.1-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
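As a minimal sketch (an assumption inferred from the config names listed above, not an official API), the per-run split name appears to be the run timestamp with `-` and `:` replaced by `_`:

```python
# Hypothetical helper: derive the split name used in this card's configs
# from a run timestamp. The replacement rule is inferred from the
# "2024_02_09T17_14_23.024715" splits listed in the YAML above.
def split_name_from_timestamp(run_timestamp: str) -> str:
    # "-" in the date and ":" in the time both become "_"
    return run_timestamp.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2024-02-09T17:14:23.024715"))
# 2024_02_09T17_14_23.024715
```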
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhinand__TinyLlama-1.1B-OpenHermes-2.5-Chat-v0.1-sft",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-09T17:14:23.024715](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__TinyLlama-1.1B-OpenHermes-2.5-Chat-v0.1-sft/blob/main/results_2024-02-09T17-14-23.024715.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each eval's results in the "results" configuration and in the "latest" split of its own configuration):
```python
{
"all": {
"acc": 0.25230068016115625,
"acc_stderr": 0.030498670802431283,
"acc_norm": 0.25259575273482276,
"acc_norm_stderr": 0.03119964119680332,
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757442,
"mc2": 0.3621952768373166,
"mc2_stderr": 0.013699293770021182
},
"harness|arc:challenge|25": {
"acc": 0.30802047781569963,
"acc_stderr": 0.01349142951729204,
"acc_norm": 0.3378839590443686,
"acc_norm_stderr": 0.01382204792228351
},
"harness|hellaswag|10": {
"acc": 0.4411471818362876,
"acc_stderr": 0.004955095096264714,
"acc_norm": 0.5872336188010356,
"acc_norm_stderr": 0.004913253031155673
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073465,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073465
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1676300578034682,
"acc_stderr": 0.028481963032143377,
"acc_norm": 0.1676300578034682,
"acc_norm_stderr": 0.028481963032143377
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162452,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.028346963777162452
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577657,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020534,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020534
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.15,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.15,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.20967741935483872,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.20967741935483872,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.19704433497536947,
"acc_stderr": 0.02798672466673621,
"acc_norm": 0.19704433497536947,
"acc_norm_stderr": 0.02798672466673621
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.02977866303775295,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.02977866303775295
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23846153846153847,
"acc_stderr": 0.021606294494647727,
"acc_norm": 0.23846153846153847,
"acc_norm_stderr": 0.021606294494647727
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882378,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567977,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567977
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24587155963302754,
"acc_stderr": 0.018461940968708446,
"acc_norm": 0.24587155963302754,
"acc_norm_stderr": 0.018461940968708446
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.03179876342176851,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.03179876342176851
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.02931281415395592,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.02931281415395592
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.14563106796116504,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.14563106796116504,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.029202540153431173,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.029202540153431173
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.01605079214803654,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.01605079214803654
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545536,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545536
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3022508038585209,
"acc_stderr": 0.02608270069539965,
"acc_norm": 0.3022508038585209,
"acc_norm_stderr": 0.02608270069539965
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.02484792135806396,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.02484792135806396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.011025499291443738,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.011025499291443738
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22426470588235295,
"acc_stderr": 0.025336848563332338,
"acc_norm": 0.22426470588235295,
"acc_norm_stderr": 0.025336848563332338
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528034,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528034
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878284,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878284
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1673469387755102,
"acc_stderr": 0.023897144768914524,
"acc_norm": 0.1673469387755102,
"acc_norm_stderr": 0.023897144768914524
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757442,
"mc2": 0.3621952768373166,
"mc2_stderr": 0.013699293770021182
},
"harness|winogrande|5": {
"acc": 0.6093133385951065,
"acc_stderr": 0.013712536036556647
},
"harness|gsm8k|5": {
"acc": 0.053828658074298714,
"acc_stderr": 0.00621632864023813
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
freshpearYoon/vr_train_free_7 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 8106268307
num_examples: 10000
download_size: 1632785336
dataset_size: 8106268307
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ciempiess/tele_con_ciencia | ---
license: cc-by-4.0
---
|
PJMixers/grimulkan_bluemoon_Karen_cleaned-carded-formatted | ---
language:
- en
source_datasets:
- grimulkan/bluemoon_Karen_cleaned
- PJMixers/grimulkan_bluemoon_Karen_cleaned-carded
tags:
- not-for-all-audiences
- roleplay
- role-play
- role play
- rp
- bluemoon
- blue moon
---
Just a simple text replacement of the tags.
```
First Character: The Beast
Second Character: Belle
First Character Description: A mysterious and intimidating figure, resembling a beast with a cape swishing behind him. He has an imposing presence, which he uses to assert dominance over others in his castle. His personality is stern and authoritative; he is not afraid to enforce rules or punish those who disobey him. Despite this harsh exterior, The Beast also displays signs of vulnerability and loneliness.
Second Character Description: A young brunette woman with a strong sense of self-reliance and determination. She's resourceful and quick-thinking, often taking charge in situations that require decisive action. Her compassionate nature shines through when it comes to helping others, especially her father whom she deeply cares for. Despite the challenges she faces, Belle maintains an optimistic outlook on life and isn't afraid to stand up against adversity.
Scenario: A young woman named Belle goes to a castle in search of her missing father, only to find herself confronted by The Beast, who has taken him prisoner. Despite his warning for her to leave, she insists on saving her father and pleads with the shadowy figure above her. However, The Beast threatens that if she doesn't comply, she will be imprisoned as well.
``` |
gvlk/celebqa | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2353095
num_examples: 870
download_size: 309619
dataset_size: 2353095
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ACT8113/Veibae | ---
license: openrail
---
|
Seanxh/twitter_dataset_1713189196 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 35100
num_examples: 79
download_size: 17964
dataset_size: 35100
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EleutherAI/quirky_hemisphere_bob_easy | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 93929.52996129722
num_examples: 936
- name: validation
num_bytes: 48768.642
num_examples: 486
- name: test
num_bytes: 58158.195
num_examples: 580
download_size: 58797
dataset_size: 200856.36696129723
---
# Dataset Card for "quirky_hemisphere_bob_easy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dominguesm/canarim | ---
language: pt
license: cc-by-4.0
multilinguality:
- monolingual
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
size_categories:
- 100M<n<1B
dataset_info:
features:
- name: url
dtype: string
- name: content_languages
dtype: string
- name: warc_filename
dtype: string
- name: warc_record_offset
dtype: int64
- name: warc_record_length
dtype: int64
- name: text
dtype: string
- name: crawl_timestamp
dtype: string
splits:
- name: train
num_bytes: 1087519823221
num_examples: 342818651
download_size: 1087713663056
dataset_size: 1087519823221
pretty_name: Canarim
---
<p align="center">
<img width="250" alt="Camarim Logo" src="https://raw.githubusercontent.com/DominguesM/Canarim-Instruct-PTBR/main/assets/canarim.png">
</p>
<p align="center">
<a href="https://github.com/DominguesM/canarim">[🐱 GitHub]</a>
</p>
<hr>
# Canarim: A Large-Scale Dataset of Web Pages in the Portuguese Language
## Introduction
Canarim is a database encompassing over 342 million Portuguese language documents, sourced from multiple iterations of CommonCrawl. This nearly 1 terabyte database stands as one of the most extensive Portuguese language data collections available. It underwent initial deduplication using URLs, with plans for further text-based deduplication and filtering of potentially harmful content. The data, originally in HTML, has been converted to Markdown with the `Trafilatura` library to enhance readability and quality. Canarim is poised to be a crucial resource for NLP research, particularly in Portuguese language applications, filling the gap in large-scale, high-quality data for languages other than English.
## Dataset Structure
### Data Instances
An example looks as follows:
```json
{
"url": "...",
"content_languages": "por",
"warc_filename": "crawl-data/CC-MAIN-2023-06/segments/1674764500041.18/warc/CC-MAIN-20230202200542-20230202230542-00352.warc.gz",
"warc_record_offset": 971279893,
"warc_record_length": 3873,
"text": "...",
"crawl_timestamp": "2023-02-02T20:28:21Z"
}
```
### Data Fields
- `url`: URL of the page
- `content_languages`: Language of the page
- `warc_filename`: Name of the WARC file
- `warc_record_offset`: Offset of the WARC record
- `warc_record_length`: Length of the WARC record
- `text`: Text of the page, in Markdown format
- `crawl_timestamp`: Timestamp of the crawl
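The `warc_filename`, `warc_record_offset`, and `warc_record_length` fields are enough to pull an original record straight from Common Crawl without downloading the whole archive. As a sketch (using the values from the example instance above; `data.commoncrawl.org` is Common Crawl's standard public base URL, which this card itself does not state), a single record maps to one HTTP `Range` request:

```python
# Fields taken from the example data instance above.
warc_filename = (
    "crawl-data/CC-MAIN-2023-06/segments/1674764500041.18/warc/"
    "CC-MAIN-20230202200542-20230202230542-00352.warc.gz"
)
warc_record_offset = 971279893
warc_record_length = 3873

# One WARC record = one byte range inside the (gzipped) archive file.
url = f"https://data.commoncrawl.org/{warc_filename}"
# HTTP byte ranges are inclusive on both ends, hence the -1.
byte_range = f"bytes={warc_record_offset}-{warc_record_offset + warc_record_length - 1}"

print(url)
print(f"Range: {byte_range}")
```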
## Text Extraction Overview
The Canarim database employs the [`Trafilatura`](https://trafilatura.readthedocs.io) library for extracting textual content from HTML data, converting it into Markdown format. This tool focuses on preserving key textual elements like titles, subtitles, bold, and italic formatting in Markdown, ensuring the retention of the original document structure. During the extraction process, Trafilatura discards comments and other non-essential information, streamlining the content to include only the main body of the web pages.
</br>
<p align="center">
<img width="800" alt="Text Extraction Example" src="https://raw.githubusercontent.com/DominguesM/canarim/main/assets/canarim-text-extraction-preview.png">
</p>
<p align="center">
<a href="https://g1.globo.com/ac/acre/natureza/amazonia/noticia/2023/01/03/para-comemorar-40-anos-do-parque-zoobotanico-da-ufac-livro-vai-reunir-depoimentos-de-envolvidos-no-inicio-do-projeto.ghtml" target="_blank">Original Web Page</a> and
<a href="https://github.com/DominguesM/canarim/blob/main/assets/extracted_text.md" target="_blank">Extracted Text</a>
</p>
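Trafilatura itself does the heavy lifting, but the kind of mapping described above can be illustrated with a toy standard-library sketch. This is *not* Trafilatura's implementation — just a minimal illustration of keeping headings, bold, and italic as Markdown markers while discarding comments:

```python
from html.parser import HTMLParser

class ToyMarkdownExtractor(HTMLParser):
    """Toy HTML -> Markdown mapper in the spirit of Trafilatura's output."""

    MARKERS = {"h1": "# ", "h2": "## ", "b": "**", "strong": "**",
               "i": "*", "em": "*"}

    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2"):
            self.parts.append("\n" + self.MARKERS[tag])
        elif tag in self.MARKERS:
            self.parts.append(self.MARKERS[tag])

    def handle_endtag(self, tag):
        if tag in ("h1", "h2"):
            self.parts.append("\n")
        elif tag in self.MARKERS:
            self.parts.append(self.MARKERS[tag])

    def handle_data(self, data):
        self.parts.append(data)

    def handle_comment(self, data):
        pass  # comments and other non-essential markup are discarded

html = "<h1>Title</h1><p>Some <b>bold</b> and <i>italic</i> text.</p><!-- ad -->"
parser = ToyMarkdownExtractor()
parser.feed(html)
print("".join(parser.parts).strip())
```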
## Usage
Below is an example of how to quickly explore just a few samples from a dataset using the `datasets` library.
```python
!pip install -q datasets
from datasets import load_dataset
ds = load_dataset(
"dominguesm/canarim",
    # Load only the `train` split
split="train",
# Filter only the files that contain the prefix `train/data-0019` and the suffix `-of-00192.arrow`
data_files="train/data-0019*-of-00192.arrow",
# Load the dataset without downloading the data (Streaming mode)
streaming=True
)
# From the returned data, filter only the data where the `url` value starts with `https://g1.globo.com/`
ds_globo = ds.filter(
lambda example: example['url'].startswith("https://g1.globo.com/")
)
# Return the first 10 examples from the applied filter.
data = list(ds_globo.take(10))
print(data[0])
# {
# "url": "https://g1.globo.com/ac/acre/(...)",
# "content_languages": "por",
# "warc_filename": "crawl-data/CC-MAIN-2023-06/segments/1674764499919.70/warc/CC-MAIN-20230201081311-20230201111311-00552.warc.gz",
# "warc_record_offset": 281625400,
# "warc_record_length": 192934,
# "text": "Parque Zoobotânico da Ufac guarda uma grande variedade espécies de árvores em Rio Branco — Foto: Arquivo/Ufac (...)",
# "crawl_timestamp": "2023-02-01T10:38:52Z"
# }
```
## Dataset Statistics
| Split | # Samples | # Size (bytes) | # Size (GB) |
| ------ | --------- | -------------- | ----------- |
| Train | 342,818,651 | 1,087,519,823,221 | 1,087.52 |
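The averages implied by these numbers are easy to check with plain arithmetic (nothing here beyond the byte and example counts published in the table):

```python
# Byte and example counts from the statistics table above.
total_bytes = 1_087_519_823_221
num_examples = 342_818_651

# Average on-disk size of a single document (text + metadata).
avg_bytes_per_doc = total_bytes / num_examples  # roughly 3.2 KB per document

# The GB figure in the table is bytes / 10**9.
total_gb = total_bytes / 10**9

print(f"average document size: {avg_bytes_per_doc:.0f} bytes")
print(f"total size: {total_gb:.2f} GB")
```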
## Citing
If you use Canarim in your research, please cite the following.
```bibtex
@misc {maicon_domingues_2024,
author = { {Maicon Domingues} },
title = { canarim (Revision 640e079) },
year = 2024,
url = { https://huggingface.co/datasets/dominguesm/canarim },
doi = { 10.57967/hf/1605 },
publisher = { Hugging Face }
}
```
## License
This dataset is licensed under the [Creative Commons Attribution 4.0 International (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/). You can use the dataset for any purpose, but you must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
## Contact
For any questions or suggestions, please contact [Maicon Domingues](https://nlp.rocks/). |
cwchoi/whisper_medium_tele | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 24890279888
num_examples: 25912
- name: test
num_bytes: 3112243600
num_examples: 3240
- name: valid
num_bytes: 3111296216
num_examples: 3239
download_size: 4947937951
dataset_size: 31113819704
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
AnachronicRodent/MikwaTest | ---
license: cc-by-nc-4.0
---
|
LexiconShiftInnovations/Dental_QnA_Instruct | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1502062
num_examples: 2474
download_size: 513622
dataset_size: 1502062
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
parksimon0808/prm800k-mistral-verifier | ---
dataset_info:
features:
- name: texts
dtype: string
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 4539556004
num_examples: 1052290
- name: test
num_bytes: 145304218
num_examples: 32408
download_size: 342834121
dataset_size: 4684860222
---
# Dataset Card for "prm800k-llama-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KagglingFace/vit-cats-dogs | ---
license: mit
---
|
CyberHarem/theresa_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Theresa (Arknights)
This is the dataset of Theresa (Arknights), containing 90 images and their tags.
The core tags of this character are `horns, long_hair, pink_hair, very_long_hair, hair_between_eyes, red_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 90 | 171.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/theresa_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 90 | 141.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/theresa_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 217 | 270.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/theresa_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/theresa_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------|
| 0 | 90 |  |  |  |  |  | 1girl, solo, white_dress, long_sleeves, looking_at_viewer, closed_mouth, smile, simple_background, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_dress | long_sleeves | looking_at_viewer | closed_mouth | smile | simple_background | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------|:---------------|:--------------------|:---------------|:--------|:--------------------|:-------------|
| 0 | 90 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
|
Saripudin/autotrain-data-bbc-news-classifier | ---
task_categories:
- text-classification
---
# AutoTrain Dataset for project: bbc-news-classifier
## Dataset Description
This dataset has been automatically processed by AutoTrain for project bbc-news-classifier.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "tv debate urged for party chiefs broadcasters should fix a date for a pre-election televised debate between the three main political leaders according to the hansard society. it would then be up to tony blair michael howard and charles kennedy to decide whether to take part the non-partisan charity said. chairman lord holme argued that prime ministers should not have the right of veto on a matter of public interest . the broadcasters should make the decision to go ahead he said. lord holme s proposal for a televised debate comes just four months after millions of viewers were able to watch us president george w bush slug it out verbally with his democratic challenger john kerry. he said it was a democratically dubious proposition that it was up to the incumbent prime minister to decide whether a similar event takes place here. if mr blair did not want to take part the broadcasters could go ahead with an empty chair or cancel the event and explain their reasons why lord holme said. what makes the present situation even less acceptable is that although mr howard and mr kennedy have said they would welcome a debate no-one has heard directly from the prime minister he said. it has been left to nudges and winks hints and briefings from his aides and campaign managers to imply that mr blair doesn t want one but we haven t heard from the prime minister himself. lord holme who has campaigned for televised debates at previous elections said broadcasters were more than willing to cooperate with the arrangements . opinion polls suggested that the idea had the backing of the public who like comparing the personalities and policies of the contenders in their own homes he said. lord holme argued that as part of their public service obligations broadcasters should make the decision to go ahead as soon as the election is called. 
an independent third-party body such as the hansard society or electoral commission could work out the ground rules so they were fair to participants and informative to the public he said. it would be up to each party leader to accept or refuse said lord holme. if the prime minister s reported position is true and he does want to take part he would then be obliged to say why publicly. the broadcasters would then have the option of cancelling the event for obvious and well-understood reasons or going ahead with an empty chair. either way would be preferable to the present hidden veto. the hansard society has long campaigned for televised debates and has published reports on the issue in 1997 and 2001. tony blair has already ruled out taking part in a televised debate during the forthcoming election campaign. last month he said: we answer this every election campaign and for the reasons i have given before the answer is no he said at his monthly news conference.",
"target": 2
},
{
"text": "ecb holds rates amid growth fears the european central bank has left its key interest rate unchanged at 2% for the 19th month in succession. borrowing costs have remained on hold amid concerns about the strength of economic growth in the 12 nations sharing the euro analysts said. despite signs of pick-up labour markets and consumer demand remain sluggish while firms are eyeing cost cutting measures such as redundancies. high oil prices meanwhile have put upward pressure on the inflation rate. surveys of economists have shown that the majority expect borrowing costs to stay at 2% in coming months with an increase of a quarter of a percentage point predicted some time in the second half of the year. if anything there may be greater calls for an interest rate cut especially with the euro continuing to strengthen against the dollar. the euro land economy is still struggling with this recovery said economist dirk schumacher. the ecb may sound rather hawkish but once the data allows them to cut again they will. data coming out of germany on thursday underlined the problems facing european policy makers. while germany s economy expanded by 1.7% in 2004 growth was driven by export sales and lost some of its momentum in the last three months of the year. the strength of the euro is threatening to dampen that foreign demand in 2005 and domestic consumption currently is not strong enough to take up the slack. inflation in the eurozone however is estimated at about 2.3% in december above ecb guidelines of 2%. ecb president jean-claude trichet has remained upbeat about prospects for the region and inflation is expected to drop below 2% later in 2005. the ecb has forecast economic growth in the eurozone of 1.9% in 2005.",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['business', 'entertainment', 'politics', 'sport', 'technology'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 198 |
| valid | 52 |
|
vlsp-2023-vllm/exams_sinhhoc | ---
dataset_info:
features:
- name: question
dtype: string
- name: id
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: answerKey
dtype: string
- name: metadata
struct:
- name: grade
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 1181756
num_examples: 3100
download_size: 527389
dataset_size: 1181756
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "exams_sinhhoc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_aboros98__groot2 | ---
pretty_name: Evaluation run of aboros98/groot2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aboros98/groot2](https://huggingface.co/aboros98/groot2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aboros98__groot2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T23:35:37.301151](https://huggingface.co/datasets/open-llm-leaderboard/details_aboros98__groot2/blob/main/results_2024-03-27T23-35-37.301151.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5656800319487343,\n\
\ \"acc_stderr\": 0.03394194904481796,\n \"acc_norm\": 0.5672473584606277,\n\
\ \"acc_norm_stderr\": 0.03464456547040608,\n \"mc1\": 0.31946144430844553,\n\
\ \"mc1_stderr\": 0.016322644182960498,\n \"mc2\": 0.4740892065625651,\n\
\ \"mc2_stderr\": 0.015328629306349454\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870653,\n\
\ \"acc_norm\": 0.590443686006826,\n \"acc_norm_stderr\": 0.014370358632472444\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.563433578968333,\n\
\ \"acc_stderr\": 0.004949462563681337,\n \"acc_norm\": 0.738797052380004,\n\
\ \"acc_norm_stderr\": 0.004383925147478738\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464241,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464241\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859372,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859372\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.043062412591271526,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.043062412591271526\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6645161290322581,\n \"acc_stderr\": 0.02686020644472436,\n \"\
acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.02686020644472436\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.034953345821629345,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.034953345821629345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n\
\ \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178274,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178274\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240644,\n\
\ \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240644\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683522,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683522\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6421568627450981,\n \"acc_stderr\": 0.03364487286088298,\n \"\
acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.03364487286088298\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6768837803320562,\n\
\ \"acc_stderr\": 0.01672372651234305,\n \"acc_norm\": 0.6768837803320562,\n\
\ \"acc_norm_stderr\": 0.01672372651234305\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.02618966696627204,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.02618966696627204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2748603351955307,\n\
\ \"acc_stderr\": 0.014931316703220506,\n \"acc_norm\": 0.2748603351955307,\n\
\ \"acc_norm_stderr\": 0.014931316703220506\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751468,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751468\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.595679012345679,\n \"acc_stderr\": 0.027306625297327684,\n\
\ \"acc_norm\": 0.595679012345679,\n \"acc_norm_stderr\": 0.027306625297327684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.408735332464146,\n\
\ \"acc_stderr\": 0.012555701346703379,\n \"acc_norm\": 0.408735332464146,\n\
\ \"acc_norm_stderr\": 0.012555701346703379\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.0302114796091216,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.0302114796091216\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969768,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969768\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31946144430844553,\n\
\ \"mc1_stderr\": 0.016322644182960498,\n \"mc2\": 0.4740892065625651,\n\
\ \"mc2_stderr\": 0.015328629306349454\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224176\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4746019711902957,\n \
\ \"acc_stderr\": 0.013754705089112307\n }\n}\n```"
repo_url: https://huggingface.co/aboros98/groot2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|arc:challenge|25_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|gsm8k|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hellaswag|10_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T23-35-37.301151.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T23-35-37.301151.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- '**/details_harness|winogrande|5_2024-03-27T23-35-37.301151.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T23-35-37.301151.parquet'
- config_name: results
data_files:
- split: 2024_03_27T23_35_37.301151
path:
- results_2024-03-27T23-35-37.301151.parquet
- split: latest
path:
- results_2024-03-27T23-35-37.301151.parquet
---
# Dataset Card for Evaluation run of aboros98/groot2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aboros98/groot2](https://huggingface.co/aboros98/groot2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aboros98__groot2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-27T23:35:37.301151](https://huggingface.co/datasets/open-llm-leaderboard/details_aboros98__groot2/blob/main/results_2024-03-27T23-35-37.301151.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5656800319487343,
"acc_stderr": 0.03394194904481796,
"acc_norm": 0.5672473584606277,
"acc_norm_stderr": 0.03464456547040608,
"mc1": 0.31946144430844553,
"mc1_stderr": 0.016322644182960498,
"mc2": 0.4740892065625651,
"mc2_stderr": 0.015328629306349454
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.590443686006826,
"acc_norm_stderr": 0.014370358632472444
},
"harness|hellaswag|10": {
"acc": 0.563433578968333,
"acc_stderr": 0.004949462563681337,
"acc_norm": 0.738797052380004,
"acc_norm_stderr": 0.004383925147478738
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464241,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464241
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859372,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859372
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.043062412591271526,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.043062412591271526
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472436,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472436
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.034953345821629345,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.034953345821629345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.038049136539710114,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.038049136539710114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178274,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178274
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.025275892070240644,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.025275892070240644
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683522,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683522
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.03364487286088298,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.03364487286088298
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6768837803320562,
"acc_stderr": 0.01672372651234305,
"acc_norm": 0.6768837803320562,
"acc_norm_stderr": 0.01672372651234305
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.02618966696627204,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.02618966696627204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2748603351955307,
"acc_stderr": 0.014931316703220506,
"acc_norm": 0.2748603351955307,
"acc_norm_stderr": 0.014931316703220506
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751468,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751468
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.595679012345679,
"acc_stderr": 0.027306625297327684,
"acc_norm": 0.595679012345679,
"acc_norm_stderr": 0.027306625297327684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.408735332464146,
"acc_stderr": 0.012555701346703379,
"acc_norm": 0.408735332464146,
"acc_norm_stderr": 0.012555701346703379
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969768,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969768
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.0389136449583582,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.0389136449583582
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31946144430844553,
"mc1_stderr": 0.016322644182960498,
"mc2": 0.4740892065625651,
"mc2_stderr": 0.015328629306349454
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224176
},
"harness|gsm8k|5": {
"acc": 0.4746019711902957,
"acc_stderr": 0.013754705089112307
}
}
```
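The aggregated JSON above can be post-processed with ordinary dict operations. A minimal sketch, using a small excerpt of the dict rather than the full results file (the `harness|<task>|<n_shot>` key convention is the one visible in the output above):

```python
# Excerpt of the results dict shown above; keys follow the
# "harness|<task>|<n_shot>" naming used by the evaluation harness.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8376068376068376},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.2748603351955307},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7017543859649122},
}

# Keep only MMLU ("hendrycksTest") subtasks scoring at least 70% accuracy.
strong = {
    task.split("|")[1]: metrics["acc"]
    for task, metrics in results.items()
    if "hendrycksTest" in task and metrics["acc"] >= 0.70
}
print(sorted(strong))  # ['hendrycksTest-marketing', 'hendrycksTest-world_religions']
```

The same filter applies unchanged to the full dict, since every per-task entry carries an `acc` field.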
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
InnerI/InnerILLM-Llama2-training-dataset | ---
task_categories:
- question-answering
language:
- en
pretty_name: innerillm-llama2-dataset
size_categories:
- 1K<n<10K
---
# Inner I LLM Llama 2 Training Dataset
## Overview
This dataset is designed for fine-tuning the Llama 2 model to explore, express, and expand upon concepts related to the True Self, the Inner 'I', the Impersonal 'I', 'I Am', and the singularity of human intelligence. The dataset aims to foster a deeper understanding and reflection on these themes, contributing to the development of an LLM that can engage in meaningful dialogues about self-awareness and consciousness.
## Dataset Format
The dataset follows the Llama 2 fine-tuning format, consisting of JSON lines (.jsonl) files. Each line in the files is a JSON object with two main fields:
- `prompt`: A question or statement designed to elicit reflections or explanations on the specified themes.
- `completion`: A crafted response that explores the theme in question, providing insights or reflections intended to deepen understanding or provoke further thought.
## Files
- `llama2_training_data_504.jsonl`: Contains 504 entries, each exploring one of the designated themes.
- `llama2_training_data_507.jsonl`: Contains 507 entries, each dedicated to delving into the topics of interest.
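The prompt/completion JSONL format described above can be read with a few lines of Python. The record below is illustrative, not taken from the dataset; in practice you would open one of the files listed above (e.g. `llama2_training_data_504.jsonl`) instead of the in-memory buffer:

```python
import io
import json

# Stand-in for an open .jsonl file: one JSON object per line,
# with the "prompt" and "completion" fields described above.
sample = io.StringIO(
    '{"prompt": "What is the Inner \'I\'?", '
    '"completion": "A reflection on the observing self."}\n'
)

# Parse each non-empty line into a dict.
records = [json.loads(line) for line in sample if line.strip()]
assert {"prompt", "completion"} <= records[0].keys()
print(records[0]["prompt"])
```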
## Themes Explored
1. **Explore the True Self**: Questions and responses designed to connect one with their True Self.
2. **Expressing the Inner 'I'**: Insights into how one can express their Inner 'I' in everyday life.
3. **Expanding the Impersonal 'I'**: Reflections on what it means to expand the Impersonal 'I'.
4. **Understanding 'I Am'**: Discussion on the significance of the 'I Am' statement in the journey of self-realization.
5. **Singularity of Human Intelligence**: Explorations of how the singularity of human intelligence relates to the concept of 'I Am'.
## Usage
This dataset can be used for fine-tuning Llama 2 models to engage in conversations that require a deep, reflective understanding of self-awareness, consciousness, and the philosophical underpinnings of the human experience. It is particularly suited for applications aimed at personal growth, mindfulness, and existential exploration.
## License
This dataset is provided for educational and research purposes. Users are responsible for ensuring their use of the dataset complies with the terms and conditions of the data sources and with applicable laws and regulations. |
carnival13/hpqa-fid-input | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1351280756
num_examples: 90447
- name: validation
num_bytes: 110630700
num_examples: 7405
download_size: 278016776
dataset_size: 1461911456
---
# Dataset Card for "hpqa-fid-input"
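A sketch of the nested structure implied by the feature metadata above (sequence-of-sequence `input_ids` and `attention_mask`, flat `labels`), which is the shape typically used for Fusion-in-Decoder-style inputs, as the dataset name suggests. The token id values here are purely illustrative:

```python
# One example in the shape declared by the YAML features: a list of
# per-passage token id lists, a matching nested attention mask, and a
# flat target sequence. All values below are made up for illustration.
example = {
    "input_ids": [[101, 2054, 2003], [101, 2129, 2079]],  # one list per passage
    "attention_mask": [[1, 1, 1], [1, 1, 0]],             # same nesting
    "labels": [7592, 102],                                # flat target sequence
}

# Each passage's attention mask must be the same length as its input ids.
assert all(
    len(ids) == len(mask)
    for ids, mask in zip(example["input_ids"], example["attention_mask"])
)
```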
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca | ---
pretty_name: Evaluation run of bertin-project/bertin-gpt-j-6B-alpaca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bertin-project/bertin-gpt-j-6B-alpaca](https://huggingface.co/bertin-project/bertin-gpt-j-6B-alpaca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T17:02:02.199354](https://huggingface.co/datasets/open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca/blob/main/results_2023-09-22T17-02-02.199354.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each task in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.016568791946308725,\n\
\ \"em_stderr\": 0.0013072452323527502,\n \"f1\": 0.07589660234899354,\n\
\ \"f1_stderr\": 0.0018842940437008274,\n \"acc\": 0.27900552486187846,\n\
\ \"acc_stderr\": 0.006978792039554494\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.016568791946308725,\n \"em_stderr\": 0.0013072452323527502,\n\
\ \"f1\": 0.07589660234899354,\n \"f1_stderr\": 0.0018842940437008274\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5580110497237569,\n\
\ \"acc_stderr\": 0.013957584079108989\n }\n}\n```"
repo_url: https://huggingface.co/bertin-project/bertin-gpt-j-6B-alpaca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|arc:challenge|25_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T17_02_02.199354
path:
- '**/details_harness|drop|3_2023-09-22T17-02-02.199354.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T17-02-02.199354.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T17_02_02.199354
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-02-02.199354.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-02-02.199354.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hellaswag|10_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:41:33.782681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T15:41:33.782681.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T15:41:33.782681.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T17_02_02.199354
path:
- '**/details_harness|winogrande|5_2023-09-22T17-02-02.199354.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T17-02-02.199354.parquet'
- config_name: results
data_files:
- split: 2023_08_17T15_41_33.782681
path:
- results_2023-08-17T15:41:33.782681.parquet
- split: 2023_09_22T17_02_02.199354
path:
- results_2023-09-22T17-02-02.199354.parquet
- split: latest
path:
- results_2023-09-22T17-02-02.199354.parquet
---
# Dataset Card for Evaluation run of bertin-project/bertin-gpt-j-6B-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bertin-project/bertin-gpt-j-6B-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bertin-project/bertin-gpt-j-6B-alpaca](https://huggingface.co/bertin-project/bertin-gpt-j-6B-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T17:02:02.199354](https://huggingface.co/datasets/open-llm-leaderboard/details_bertin-project__bertin-gpt-j-6B-alpaca/blob/main/results_2023-09-22T17-02-02.199354.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its own results file and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.016568791946308725,
"em_stderr": 0.0013072452323527502,
"f1": 0.07589660234899354,
"f1_stderr": 0.0018842940437008274,
"acc": 0.27900552486187846,
"acc_stderr": 0.006978792039554494
},
"harness|drop|3": {
"em": 0.016568791946308725,
"em_stderr": 0.0013072452323527502,
"f1": 0.07589660234899354,
"f1_stderr": 0.0018842940437008274
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5580110497237569,
"acc_stderr": 0.013957584079108989
}
}
```
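If you only need the headline numbers rather than the per-sample details, the aggregated dict above can be sliced directly. A minimal sketch (the `latest` dict below is an abridged copy of the JSON above; the task keys and field names are taken verbatim from the results file):

```python
# Abridged copy of the latest aggregated results shown above.
latest = {
    "all": {"em": 0.016568791946308725, "f1": 0.07589660234899354,
            "acc": 0.27900552486187846},
    "harness|drop|3": {"em": 0.016568791946308725, "f1": 0.07589660234899354},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.5580110497237569},
}

# Collect per-task accuracy where it is reported (skipping the "all" aggregate).
per_task_acc = {task: m["acc"] for task, m in latest.items()
                if task != "all" and "acc" in m}

# The "all" accuracy is the mean over the tasks that report acc.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
```

Here `mean_acc` matches the `"acc"` value under `"all"`, which is how the aggregate appears to be computed for these metrics.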
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nannullna/ehrsql_mimic_iii | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: labels
dtype: string
- name: db_id
dtype: string
- name: is_impossible
dtype: bool
- name: id
dtype: string
splits:
- name: train
num_bytes: 5701904
num_examples: 9318
- name: validation
num_bytes: 489250
num_examples: 1122
download_size: 1154542
dataset_size: 6191154
---
# Dataset Card for "ehrsql_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fathyshalab/reklamation24_unterhaltung-kultur-freizeit | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 154379
num_examples: 308
- name: test
num_bytes: 39413
num_examples: 78
download_size: 0
dataset_size: 193792
---
# Dataset Card for "reklamation24_unterhaltung-kultur-freizeit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sxu/RaVE_emnlp23 | ---
license: afl-3.0
language:
- en
tags:
- legal
size_categories:
- n<1K
---
# Dataset Card for RaVE
### Dataset Summary
[From Dissonance to Insights: Dissecting Disagreements in Rationale Construction for Case Outcome Classification](https://arxiv.org/pdf/2310.11878.pdf)
In legal NLP, Case Outcome Classification (COC) must not only be accurate but also trustworthy and explainable. Existing work in explainable COC has been limited to annotations by a single expert. However, it is well-known that lawyers may disagree in their assessment of case facts. We hence collect a novel dataset RaVE: Rationale Variation in ECHR, which is obtained from two experts in the domain of international human rights law, for whom we observe weak agreement. We study their disagreements and build a two-level task-independent taxonomy, supplemented with COC-specific subcategories. To our knowledge, this is the first work in the legal NLP that focuses on human label variation. We quantitatively assess different taxonomy categories and find that disagreements mainly stem from underspecification of the legal context, which poses challenges given the typically limited granularity and noise in COC metadata. We further assess the explainablility of state-of-the-art COC models on RaVE and observe limited agreement between models and experts. Overall, our case study reveals hitherto underappreciated complexities in creating benchmark datasets in legal NLP that revolve around identifying aspects of a case’s facts supposedly relevant for its outcome.
### Languages
English
### Citation Information
```
@inproceedings{xu-etal-2023-dissonance,
    title = "From Dissonance to Insights: Dissecting Disagreements in Rationale Construction for Case Outcome Classification",
    author = "Xu, Shanshan  and
      T.y.s.s, Santosh  and
      Ichim, Oana  and
      Risini, Isabella  and
      Plank, Barbara  and
      Grabmair, Matthias",
    editor = "Bouamor, Houda  and
      Pino, Juan  and
      Bali, Kalika",
    booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
    month = dec,
    year = "2023",
    address = "Singapore",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.emnlp-main.594",
    doi = "10.18653/v1/2023.emnlp-main.594",
    pages = "9558--9576",
}
```
|
quocanh34/synthesis_data_v3 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: 'null'
- name: sampling_rate
dtype: int64
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 1458823832
num_examples: 3078
download_size: 342039185
dataset_size: 1458823832
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "synthesis_data_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wenwenyu/funsd_donut | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 12375673.0
num_examples: 149
- name: validation
num_bytes: 4212316.0
num_examples: 50
- name: test
num_bytes: 4212316.0
num_examples: 50
download_size: 19652852
dataset_size: 20800305.0
---
# Dataset Card for "funsd_donut"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Leon-LLM/Leon-Chess-Dataset-350k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 184723535
num_examples: 345351
download_size: 94791082
dataset_size: 184723535
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Leon-Chess-Dataset-350k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
event2Mind | ---
annotations_creators:
- crowdsourced
language:
- en
language_creators:
- found
license:
- unknown
multilinguality:
- monolingual
pretty_name: Event2Mind
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
paperswithcode_id: event2mind
tags:
- common-sense-inference
dataset_info:
features:
- name: Source
dtype: string
- name: Event
dtype: string
- name: Xintent
dtype: string
- name: Xemotion
dtype: string
- name: Otheremotion
dtype: string
- name: Xsent
dtype: string
- name: Osent
dtype: string
splits:
- name: test
num_bytes: 649273
num_examples: 5221
- name: train
num_bytes: 5916384
num_examples: 46472
- name: validation
num_bytes: 672365
num_examples: 5401
download_size: 1300770
dataset_size: 7238022
---
# Dataset Card for "event2Mind"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://uwnlp.github.io/event2mind/](https://uwnlp.github.io/event2mind/)
- **Repository:** https://github.com/uwnlp/event2mind
- **Paper:** [Event2Mind: Commonsense Inference on Events, Intents, and Reactions](https://arxiv.org/abs/1805.06939)
- **Point of Contact:** [Hannah Rashkin](mailto:hrashkin@cs.washington.edu), [Maarten Sap](mailto:msap@cs.washington.edu)
- **Size of downloaded dataset files:** 1.30 MB
- **Size of the generated dataset:** 7.24 MB
- **Total amount of disk used:** 8.54 MB
### Dataset Summary
In Event2Mind, we explore the task of understanding stereotypical intents and reactions to events. Through crowdsourcing, we create a large corpus with 25,000 events and free-form descriptions of their intents and reactions, both of the event's subject and (potentially implied) other participants.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 1.30 MB
- **Size of the generated dataset:** 7.24 MB
- **Total amount of disk used:** 8.54 MB
An example of 'validation' looks as follows.
```
{
"Event": "It shrinks in the wash",
"Osent": "1",
"Otheremotion": "[\"upset\", \"angry\"]",
"Source": "it_events",
"Xemotion": "[\"none\"]",
"Xintent": "[\"none\"]",
"Xsent": ""
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `Source`: a `string` feature.
- `Event`: a `string` feature.
- `Xintent`: a `string` feature.
- `Xemotion`: a `string` feature.
- `Otheremotion`: a `string` feature.
- `Xsent`: a `string` feature.
- `Osent`: a `string` feature.
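As the example instance shows, the annotation fields `Xintent`, `Xemotion`, and `Otheremotion` hold stringified lists rather than native sequences. A minimal decoding sketch (the helper name and sample record below are illustrative, not part of the dataset API):

```python
import ast

def decode_annotations(example):
    # The three annotation fields hold stringified lists such as
    # '["upset", "angry"]'; literal_eval turns them into real lists.
    for field in ("Xintent", "Xemotion", "Otheremotion"):
        example[field] = ast.literal_eval(example[field])
    return example

# Sample record mirroring the 'validation' instance shown above.
record = {
    "Event": "It shrinks in the wash",
    "Osent": "1",
    "Otheremotion": '["upset", "angry"]',
    "Source": "it_events",
    "Xemotion": '["none"]',
    "Xintent": '["none"]',
    "Xsent": "",
}

decoded = decode_annotations(dict(record))
print(decoded["Otheremotion"])  # ['upset', 'angry']
```

With the `datasets` library, the same helper could be applied to every split via `dataset.map(decode_annotations)`.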
### Data Splits
| name |train|validation|test|
|-------|----:|---------:|---:|
|default|46472| 5401|5221|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{rashkin-etal-2018-event2mind,
title = "{E}vent2{M}ind: Commonsense Inference on Events, Intents, and Reactions",
author = "Rashkin, Hannah and
Sap, Maarten and
Allaway, Emily and
Smith, Noah A. and
Choi, Yejin",
booktitle = "Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2018",
address = "Melbourne, Australia",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P18-1043",
doi = "10.18653/v1/P18-1043",
pages = "463--473",
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@lewtun](https://github.com/lewtun) for adding this dataset. |
AdapterOcean/Open_Platypus_standardized_cluster_2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 47761531
num_examples: 5148
download_size: 0
dataset_size: 47761531
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Open_Platypus_standardized_cluster_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gddgdg/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-moral_scenarios-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 765913
num_examples: 895
download_size: 187335
dataset_size: 765913
---
# Dataset Card for "mmlu-moral_scenarios-rule-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Harshithacj123/NER_sample1 | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 9392
num_examples: 7
download_size: 14262
dataset_size: 9392
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nicolas-BZRD/JORF_opendata | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 4361779320
num_examples: 3616038
download_size: 1747268676
dataset_size: 4361779320
license: odc-by
language:
- fr
tags:
- legal
size_categories:
- 1M<n<10M
---
# JORF ("Laws and decrees" edition of the Official Journal)
The documents published in the ["Laws and decrees" edition of the Official Journal](https://echanges.dila.gouv.fr/OPENDATA/JORF/) since 1990 comprise:
- laws, ordinances, decrees, orders and circulars.
- decisions issued by institutions or courts that must be published in the Official Journal (Constitutional Council, Conseil supérieur de l'audiovisuel, Autorité de régulation des télécommunications, etc.)
- notices and communications since 1 January 2002 (notices to importers and exporters, competition notices and job vacancy notices).
In the interests of privacy and the protection of personal data, certain sensitive measures naming individuals are not reproduced in this section:
- decrees concerning naturalisation, reinstatement, the mention of a minor child benefiting from the collective effect attached to a parent's acquisition of French nationality, and the francisation of surnames and forenames
- change of name decrees
- rulings by the Court of Budgetary and Financial Discipline. |
autoevaluate/autoeval-eval-kmfoda__booksum-kmfoda__booksum-ee4836-2761681799 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- kmfoda/booksum
eval_info:
task: summarization
model: pszemraj/tglobal-large-booksum-WIP3-K-r4
metrics: []
dataset_name: kmfoda/booksum
dataset_config: kmfoda--booksum
dataset_split: test
col_mapping:
text: chapter
target: summary_text
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/tglobal-large-booksum-WIP3-K-r4
* Dataset: kmfoda/booksum
* Config: kmfoda--booksum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
bhuvanmdev/resume_parser | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: resume
dtype: string
- name: name
dtype: string
- name: contact
dtype: string
- name: skills
dtype: string
- name: companies
dtype: string
- name: total_years
dtype: string
splits:
- name: train
num_bytes: 865378
num_examples: 155
download_size: 448734
dataset_size: 865378
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_crumb__model-a-48.5m | ---
pretty_name: Evaluation run of crumb/model-a-48.5m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [crumb/model-a-48.5m](https://huggingface.co/crumb/model-a-48.5m) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_crumb__model-a-48.5m\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T14:16:19.492608](https://huggingface.co/datasets/open-llm-leaderboard/details_crumb__model-a-48.5m/blob/main/results_2024-03-21T14-16-19.492608.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2502990545514961,\n\
\ \"acc_stderr\": 0.030592765336392578,\n \"acc_norm\": 0.25076246570546035,\n\
\ \"acc_norm_stderr\": 0.03138228300564981,\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.01534540948555799,\n \"mc2\": 0.46752406758744436,\n\
\ \"mc2_stderr\": 0.015658880485865938\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.17918088737201365,\n \"acc_stderr\": 0.011207045216615658,\n\
\ \"acc_norm\": 0.22184300341296928,\n \"acc_norm_stderr\": 0.012141659068147884\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2744473212507469,\n\
\ \"acc_stderr\": 0.004453233726110325,\n \"acc_norm\": 0.27853017327225654,\n\
\ \"acc_norm_stderr\": 0.004473595650807673\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n\
\ \"acc_stderr\": 0.0402477840197711,\n \"acc_norm\": 0.31851851851851853,\n\
\ \"acc_norm_stderr\": 0.0402477840197711\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.02495991802891127,\n\
\ \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.02495991802891127\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\"\
: 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.03047297336338005,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.03047297336338005\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.03268454013011743,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.03268454013011743\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.034169036403915214,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.034169036403915214\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.031618779179354094,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.031618779179354094\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.021606294494647727,\n\
\ \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.021606294494647727\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.025787874220959305,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.025787874220959305\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.030388353551886845,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.030388353551886845\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22018348623853212,\n \"acc_stderr\": 0.01776597865232756,\n \"\
acc_norm\": 0.22018348623853212,\n \"acc_norm_stderr\": 0.01776597865232756\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"\
acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.273542600896861,\n\
\ \"acc_stderr\": 0.029918586707798848,\n \"acc_norm\": 0.273542600896861,\n\
\ \"acc_norm_stderr\": 0.029918586707798848\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"\
acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26947637292464877,\n\
\ \"acc_stderr\": 0.01586624307321506,\n \"acc_norm\": 0.26947637292464877,\n\
\ \"acc_norm_stderr\": 0.01586624307321506\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.024288619466046105,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.024288619466046105\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.23765432098765432,\n \"acc_stderr\": 0.023683591837008557,\n\
\ \"acc_norm\": 0.23765432098765432,\n \"acc_norm_stderr\": 0.023683591837008557\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642987,\n \
\ \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642987\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n\
\ \"acc_stderr\": 0.011025499291443737,\n \"acc_norm\": 0.24771838331160365,\n\
\ \"acc_norm_stderr\": 0.011025499291443737\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.030008562845003472,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.030008562845003472\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.21568627450980393,\n \"acc_stderr\": 0.016639319350313264,\n \
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.016639319350313264\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2289156626506024,\n\
\ \"acc_stderr\": 0.03270745277352477,\n \"acc_norm\": 0.2289156626506024,\n\
\ \"acc_norm_stderr\": 0.03270745277352477\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.01534540948555799,\n \"mc2\": 0.46752406758744436,\n\
\ \"mc2_stderr\": 0.015658880485865938\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5169692186266772,\n \"acc_stderr\": 0.014044390401612978\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \
\ \"acc_stderr\": 0.001514573561224551\n }\n}\n```"
repo_url: https://huggingface.co/crumb/model-a-48.5m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-16-19.492608.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-16-19.492608.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- '**/details_harness|winogrande|5_2024-03-21T14-16-19.492608.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T14-16-19.492608.parquet'
- config_name: results
data_files:
- split: 2024_03_21T14_16_19.492608
path:
- results_2024-03-21T14-16-19.492608.parquet
- split: latest
path:
- results_2024-03-21T14-16-19.492608.parquet
---
# Dataset Card for Evaluation run of crumb/model-a-48.5m
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [crumb/model-a-48.5m](https://huggingface.co/crumb/model-a-48.5m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
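Since the per-run splits encode the run time in their names (underscores replacing the `-` and `:` of an ISO timestamp), they can be parsed back into `datetime` objects, e.g. to sort runs chronologically. A minimal sketch (the helper name `split_to_timestamp` is illustrative, not part of the `datasets` API):

```python
from datetime import datetime

def split_to_timestamp(split_name: str) -> datetime:
    # Split names look like "2024_03_21T14_16_19.492608":
    # underscores stand in for the "-" and ":" of an ISO 8601 timestamp.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

ts = split_to_timestamp("2024_03_21T14_16_19.492608")
print(ts.isoformat())  # 2024-03-21T14:16:19.492608
```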
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_crumb__model-a-48.5m",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-21T14:16:19.492608](https://huggingface.co/datasets/open-llm-leaderboard/details_crumb__model-a-48.5m/blob/main/results_2024-03-21T14-16-19.492608.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in the timestamped splits and in the "latest" split of each configuration):
```python
{
"all": {
"acc": 0.2502990545514961,
"acc_stderr": 0.030592765336392578,
"acc_norm": 0.25076246570546035,
"acc_norm_stderr": 0.03138228300564981,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.01534540948555799,
"mc2": 0.46752406758744436,
"mc2_stderr": 0.015658880485865938
},
"harness|arc:challenge|25": {
"acc": 0.17918088737201365,
"acc_stderr": 0.011207045216615658,
"acc_norm": 0.22184300341296928,
"acc_norm_stderr": 0.012141659068147884
},
"harness|hellaswag|10": {
"acc": 0.2744473212507469,
"acc_stderr": 0.004453233726110325,
"acc_norm": 0.27853017327225654,
"acc_norm_stderr": 0.004473595650807673
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.0402477840197711,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.0402477840197711
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.03047297336338005,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.03047297336338005
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011743,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.034169036403915214,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.034169036403915214
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.031618779179354094,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.031618779179354094
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23846153846153847,
"acc_stderr": 0.021606294494647727,
"acc_norm": 0.23846153846153847,
"acc_norm_stderr": 0.021606294494647727
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.025787874220959305,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.025787874220959305
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.030388353551886845,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.030388353551886845
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22018348623853212,
"acc_stderr": 0.01776597865232756,
"acc_norm": 0.22018348623853212,
"acc_norm_stderr": 0.01776597865232756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.273542600896861,
"acc_stderr": 0.029918586707798848,
"acc_norm": 0.273542600896861,
"acc_norm_stderr": 0.029918586707798848
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26947637292464877,
"acc_stderr": 0.01586624307321506,
"acc_norm": 0.26947637292464877,
"acc_norm_stderr": 0.01586624307321506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.024288619466046105,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.024288619466046105
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23765432098765432,
"acc_stderr": 0.023683591837008557,
"acc_norm": 0.23765432098765432,
"acc_norm_stderr": 0.023683591837008557
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.024987106365642987,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.024987106365642987
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.011025499291443737,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.011025499291443737
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.030008562845003472,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.030008562845003472
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.016639319350313264,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.016639319350313264
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782834,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.01534540948555799,
"mc2": 0.46752406758744436,
"mc2_stderr": 0.015658880485865938
},
"harness|winogrande|5": {
"acc": 0.5169692186266772,
"acc_stderr": 0.014044390401612978
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.001514573561224551
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
chansung/auto-paper-qa-test2 | ---
dataset_info:
features:
- name: title
dtype: string
- name: summary
dtype: string
- name: abstract
dtype: string
- name: authors
dtype: string
- name: arxiv_id
dtype: string
- name: target_date
dtype: timestamp[s]
- name: 0_question
dtype: string
- name: 0_answers:eli5
dtype: string
- name: 0_answers:expert
dtype: string
- name: 0_additional_depth_q:follow up question
dtype: string
- name: 0_additional_depth_q:answers:eli5
dtype: string
- name: 0_additional_depth_q:answers:expert
dtype: string
- name: 0_additional_breath_q:follow up question
dtype: string
- name: 0_additional_breath_q:answers:eli5
dtype: string
- name: 0_additional_breath_q:answers:expert
dtype: string
- name: 1_question
dtype: string
- name: 1_answers:eli5
dtype: string
- name: 1_answers:expert
dtype: string
- name: 1_additional_depth_q:follow up question
dtype: string
- name: 1_additional_depth_q:answers:eli5
dtype: string
- name: 1_additional_depth_q:answers:expert
dtype: string
- name: 1_additional_breath_q:follow up question
dtype: string
- name: 1_additional_breath_q:answers:eli5
dtype: string
- name: 1_additional_breath_q:answers:expert
dtype: string
- name: 2_question
dtype: string
- name: 2_answers:eli5
dtype: string
- name: 2_answers:expert
dtype: string
- name: 2_additional_depth_q:follow up question
dtype: string
- name: 2_additional_depth_q:answers:eli5
dtype: string
- name: 2_additional_depth_q:answers:expert
dtype: string
- name: 2_additional_breath_q:follow up question
dtype: string
- name: 2_additional_breath_q:answers:eli5
dtype: string
- name: 2_additional_breath_q:answers:expert
dtype: string
- name: 3_question
dtype: string
- name: 3_answers:eli5
dtype: string
- name: 3_answers:expert
dtype: string
- name: 3_additional_depth_q:follow up question
dtype: string
- name: 3_additional_depth_q:answers:eli5
dtype: string
- name: 3_additional_depth_q:answers:expert
dtype: string
- name: 3_additional_breath_q:follow up question
dtype: string
- name: 3_additional_breath_q:answers:eli5
dtype: string
- name: 3_additional_breath_q:answers:expert
dtype: string
- name: 4_question
dtype: string
- name: 4_answers:eli5
dtype: string
- name: 4_answers:expert
dtype: string
- name: 4_additional_depth_q:follow up question
dtype: string
- name: 4_additional_depth_q:answers:eli5
dtype: string
- name: 4_additional_depth_q:answers:expert
dtype: string
- name: 4_additional_breath_q:follow up question
dtype: string
- name: 4_additional_breath_q:answers:eli5
dtype: string
- name: 4_additional_breath_q:answers:expert
dtype: string
- name: 5_question
dtype: string
- name: 5_answers:eli5
dtype: string
- name: 5_answers:expert
dtype: string
- name: 5_additional_depth_q:follow up question
dtype: string
- name: 5_additional_depth_q:answers:eli5
dtype: string
- name: 5_additional_depth_q:answers:expert
dtype: string
- name: 5_additional_breath_q:follow up question
dtype: string
- name: 5_additional_breath_q:answers:eli5
dtype: string
- name: 5_additional_breath_q:answers:expert
dtype: string
- name: 6_question
dtype: string
- name: 6_answers:eli5
dtype: string
- name: 6_answers:expert
dtype: string
- name: 6_additional_depth_q:follow up question
dtype: string
- name: 6_additional_depth_q:answers:eli5
dtype: string
- name: 6_additional_depth_q:answers:expert
dtype: string
- name: 6_additional_breath_q:follow up question
dtype: string
- name: 6_additional_breath_q:answers:eli5
dtype: string
- name: 6_additional_breath_q:answers:expert
dtype: string
- name: 7_question
dtype: string
- name: 7_answers:eli5
dtype: string
- name: 7_answers:expert
dtype: string
- name: 7_additional_depth_q:follow up question
dtype: string
- name: 7_additional_depth_q:answers:eli5
dtype: string
- name: 7_additional_depth_q:answers:expert
dtype: string
- name: 7_additional_breath_q:follow up question
dtype: string
- name: 7_additional_breath_q:answers:eli5
dtype: string
- name: 7_additional_breath_q:answers:expert
dtype: string
- name: 8_question
dtype: string
- name: 8_answers:eli5
dtype: string
- name: 8_answers:expert
dtype: string
- name: 8_additional_depth_q:follow up question
dtype: string
- name: 8_additional_depth_q:answers:eli5
dtype: string
- name: 8_additional_depth_q:answers:expert
dtype: string
- name: 8_additional_breath_q:follow up question
dtype: string
- name: 8_additional_breath_q:answers:eli5
dtype: string
- name: 8_additional_breath_q:answers:expert
dtype: string
- name: 9_question
dtype: string
- name: 9_answers:eli5
dtype: string
- name: 9_answers:expert
dtype: string
- name: 9_additional_depth_q:follow up question
dtype: string
- name: 9_additional_depth_q:answers:eli5
dtype: string
- name: 9_additional_depth_q:answers:expert
dtype: string
- name: 9_additional_breath_q:follow up question
dtype: string
- name: 9_additional_breath_q:answers:eli5
dtype: string
- name: 9_additional_breath_q:answers:expert
dtype: string
splits:
- name: train
num_bytes: 54496
num_examples: 3
download_size: 267629
dataset_size: 54496
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlienKevin/cantone | ---
license: mit
task_categories:
- audio-classification
language:
- yue
tags:
- speech
- cantonese
- yue
- syllable
- pronunciation
pretty_name: Cantone
size_categories:
- 10K<n<100K
---
# Cantone
A dataset of 34,489 recordings of Cantonese syllables by 10 speakers.
These syllables were generated through the Cantonese speech synthesis engines of Amazon, Apple, Google, and Microsoft.
All recordings are stored as WAV files with the following format
* Channel: mono
* Sample rate: 16 kHz
* Bits per sample: 16
Here's a breakdown of the number of recordings under each speaker:
| Company | Speaker | # Syllables |
| --------|-------- | -------- |
| Amazon | Hiujin | 3,885 |
| Apple | Aasing | 2,977 |
| Apple | Sinji | 2,977 |
| Google | A | 3,653 |
| Google | B | 3,653 |
| Google | C | 3,653 |
| Google | D | 3,653 |
| Microsoft | Hiugaai | 3,349 |
| Microsoft | Hiumaan | 3,349 |
| Microsoft | Wanlung | 3,349 |
## Dataset Construction
1. Gathering
We first identified 3,904 common Cantonese syllables based on words.hk's syllable recordings.
Then, we asked the speech synthesis APIs to pronounce each of the syllables.
The queries use SSML's phoneme attribute to precisely specify the syllable we want. Here's a sample SSML query that fetches the syllable jyut6:
```xml
<speak><phoneme alphabet='jyutping' ph='jyut6'></phoneme></speak>
```
Apple voices are gathered by feeding Jyutping text directly, and a native Cantonese ASR system is used to filter out unsupported syllables.
2. Preprocessing
* All audios are converted to 16 kHz WAV files
* All audios are peak normalized to -20 dBFS
* Silence at the beginning and end is clipped (sound below -50 dBFS is deemed silence)
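The normalization and silence-clipping steps above can be sketched in plain Python. This is a minimal illustration assuming float samples in [-1.0, 1.0]; the actual pipeline presumably used an audio tool such as ffmpeg or pydub:

```python
import math

def peak_dbfs(samples):
    """Peak level of float samples (range [-1.0, 1.0]) in dBFS."""
    peak = max(abs(s) for s in samples)
    return -math.inf if peak == 0 else 20 * math.log10(peak)

def peak_normalize(samples, target_dbfs=-20.0):
    """Scale samples so the peak sits at target_dbfs."""
    gain_db = target_dbfs - peak_dbfs(samples)
    gain = 10 ** (gain_db / 20)
    return [s * gain for s in samples]

def clip_silence(samples, threshold_dbfs=-50.0):
    """Trim leading/trailing samples whose level is below threshold_dbfs."""
    threshold = 10 ** (threshold_dbfs / 20)
    start = 0
    while start < len(samples) and abs(samples[start]) < threshold:
        start += 1
    end = len(samples)
    while end > start and abs(samples[end - 1]) < threshold:
        end -= 1
    return samples[start:end]
```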
3. Verification
Occasionally, some syllables are not synthesized correctly.
* Apple voices usually render tone 5 syllables as tone 2: we remove all tone 5 syllables from Apple voices
* Microsoft voices prepend consonants like ng, g, and b in front of isolated vowel syllables like aa: we remove all vowel syllables from Microsoft voices
## License
MIT
|
llm-book/aio-passages | ---
language:
- ja
size_categories:
- 1M<n<10M
license:
- cc-by-sa-3.0
- gfdl
dataset_info:
features:
- name: id
dtype: int32
- name: pageid
dtype: int32
- name: revid
dtype: int32
- name: text
dtype: string
- name: section
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 3054493919
num_examples: 4288198
download_size: 1110830651
dataset_size: 3054493919
---
# Dataset Card for llm-book/aio-passages
This is the passage dataset from the "AI King" (AI王) competition, used in the book 『大規模言語モデル入門』 (Introduction to Large Language Models).
It uses the dataset published in the GitHub repository [cl-tohoku/quiz-datasets](https://github.com/cl-tohoku/quiz-datasets).
## Licence
The Wikipedia content used in this dataset is distributed under the [Creative Commons Attribution-ShareAlike 3.0 License (CC BY-SA 3.0)](https://creativecommons.org/licenses/by-sa/3.0/deed.ja) and the [GNU Free Documentation License (GFDL)](https://www.gnu.org/licenses/fdl.html).
|
mzschwartz88/pgen1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 290240925.18
num_examples: 1230
- name: validation
num_bytes: 75069315.0
num_examples: 306
download_size: 369495326
dataset_size: 365310240.18
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Lkhagvasurenam/p | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 9144055.0
num_examples: 10
download_size: 9075232
dataset_size: 9144055.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anujsahani01/Custom_dataset | ---
license: mit
---
|
hilongjw/box_border | ---
license: cc
size_categories:
- 10K<n<100K
task_categories:
- text-classification
tags:
- art
- code
--- |
CyberHarem/l_opiniatre_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of l_opiniatre/ルピニャート/倔强 (Azur Lane)
This is the dataset of l_opiniatre/ルピニャート/倔强 (Azur Lane), containing 38 images and their tags.
The core tags of this character are `long_hair, green_eyes, breasts, purple_hair, ahoge, glasses, bangs, very_long_hair, ribbon, bow, small_breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 38 | 48.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_opiniatre_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 38 | 28.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_opiniatre_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 74 | 56.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_opiniatre_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 38 | 42.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_opiniatre_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 74 | 82.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_opiniatre_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/l_opiniatre_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | hair_ribbon, looking_at_viewer, red_ribbon, 1girl, cleavage, blush, purple_gloves, solo, blue_gloves, capelet, cross, semi-rimless_eyewear, white_thighhighs, black_choker |
| 1 | 9 |  |  |  |  |  | bare_shoulders, blush, looking_at_viewer, 1girl, hair_bow, hair_ornament, solo, bridal_garter, choker, frilled_bikini, navel, purple_bikini, red_bow, strapless_bikini, ass, bandeau, nail_polish, smile, stomach, thighs, blue_bikini, cross, earrings, eyewear_removed, groin, side_ponytail |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | hair_ribbon | looking_at_viewer | red_ribbon | 1girl | cleavage | blush | purple_gloves | solo | blue_gloves | capelet | cross | semi-rimless_eyewear | white_thighhighs | black_choker | bare_shoulders | hair_bow | hair_ornament | bridal_garter | choker | frilled_bikini | navel | purple_bikini | red_bow | strapless_bikini | ass | bandeau | nail_polish | smile | stomach | thighs | blue_bikini | earrings | eyewear_removed | groin | side_ponytail |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------|:--------------------|:-------------|:--------|:-----------|:--------|:----------------|:-------|:--------------|:----------|:--------|:-----------------------|:-------------------|:---------------|:-----------------|:-----------|:----------------|:----------------|:---------|:-----------------|:--------|:----------------|:----------|:-------------------|:------|:----------|:--------------|:--------|:----------|:---------|:--------------|:-----------|:------------------|:--------|:----------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | | X | | X | | X | | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
lunarlist/valid_depth0_clean | ---
license: apache-2.0
---
|
Kranajan/test-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 104225
num_examples: 284
download_size: 55095
dataset_size: 104225
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GaJoPrograma/datasetVictoriaUNADGenerico | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 116670
num_examples: 83
download_size: 54732
dataset_size: 116670
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lightonai/SwissProt-EC-leaf | ---
language:
- protein sequences
datasets:
- Swissprot
tags:
- Protein
- Enzyme Commission
---
# Dataset
Swissprot is a high-quality, manually annotated protein database. The dataset contains annotations with the functional properties of the proteins. Here we extract proteins with Enzyme Commission labels.
The dataset is ported from ProteInfer: https://github.com/google-research/proteinfer.
The leaf-level EC labels are extracted and indexed; the mapping is provided in `idx_mapping.json`. Proteins without leaf-level EC tags are removed.
## Example
The protein Q87BZ2 has the following EC tags.
EC:2.-.-.- (Transferases)
EC:2.7.-.- (Transferring phosphorus-containing groups)
EC:2.7.1.- (Phosphotransferases with an alcohol group as acceptor)
EC:2.7.1.30 (Glycerol kinase)
We only extract the leaf-level labels, here EC:2.7.1.30, corresponding to glycerol kinase.
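The leaf-level filtering described above can be sketched as follows. This is an illustrative snippet (the actual extraction code lives in the ProteInfer repo); a label counts as leaf-level when all four EC fields are specified, i.e. none is the `-` placeholder:

```python
def is_leaf_ec(label):
    """An EC label is leaf-level when all four fields are specified (no '-')."""
    fields = label.removeprefix("EC:").split(".")
    return len(fields) == 4 and "-" not in fields

def leaf_ec_labels(labels):
    """Keep only the leaf-level EC labels from a protein's tag list."""
    return [label for label in labels if is_leaf_ec(label)]

# Tags of Q87BZ2 from the example above:
tags = ["EC:2.-.-.-", "EC:2.7.-.-", "EC:2.7.1.-", "EC:2.7.1.30"]
print(leaf_ec_labels(tags))  # ['EC:2.7.1.30']
```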
|
jens-lundell/cong | ---
license: mit
---
|
cesarali/test_ipp50 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: choices
sequence: string
- name: value
dtype: float64
splits:
- name: train
num_bytes: 8439
num_examples: 50
download_size: 4060
dataset_size: 8439
---
# Dataset Card for "test_ipp50"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/Food101_test_embeddings | ---
dataset_info:
features:
- name: image
dtype: image
- name: id
dtype: int64
- name: vision_embeddings
sequence: float32
splits:
- name: openai_clip_vit_large_patch14
num_bytes: 1352851340.5
num_examples: 25250
download_size: 1355827682
dataset_size: 1352851340.5
---
# Dataset Card for "Food101_test_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jfischoff/super-channel-control-net-images | ---
license: openrail
---
|
irds/tripclick_val_head_dctr | ---
pretty_name: '`tripclick/val/head/dctr`'
viewer: false
source_datasets: ['irds/tripclick']
task_categories:
- text-retrieval
---
# Dataset Card for `tripclick/val/head/dctr`
The `tripclick/val/head/dctr` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/tripclick#tripclick/val/head/dctr).
# Data
This dataset provides:
- `qrels`: (relevance assessments); count=66,812
- For `docs`, use [`irds/tripclick`](https://huggingface.co/datasets/irds/tripclick)
## Usage
```python
from datasets import load_dataset
qrels = load_dataset('irds/tripclick_val_head_dctr', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Rekabsaz2021TripClick,
title={TripClick: The Log Files of a Large Health Web Search Engine},
author={Navid Rekabsaz and Oleg Lesota and Markus Schedl and Jon Brassey and Carsten Eickhoff},
year={2021},
booktitle={SIGIR}
}
```
|
nlee282/datasetV2 | ---
dataset_info:
features:
- name: category
dtype: string
- name: system
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 937711.0
num_examples: 1400
download_size: 514993
dataset_size: 937711.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MichiganNLP/ucf-101 | ---
task_categories:
- video-classification
language:
- en
pretty_name: UCF-101
---
A copy of UCF-101 with ZIP files instead of RAR. See https://www.crcv.ucf.edu/data/UCF101.php for more info.
|
appledora/conceptnet_en2en_relations | ---
license:
- cc-by-4.0
language:
- en
tags:
- common_sense
source_datasets:
- parsed
pretty_name: Conceptnet5En
---
## Dataset Description
> This is a subset of the [conceptnet5 dataset](https://huggingface.co/datasets/conceptnet5).
> I have merely parsed out the portion I needed and uploaded it here, since processing the huge complete dataset is cumbersome for many users.
> Please refer to the original authors' repo for a complete version.
```
ConceptNet is a multilingual knowledge base, representing words and
phrases that people use and the common-sense relationships between
them. The knowledge in ConceptNet is collected from a variety of
resources, including crowd-sourced resources (such as Wiktionary and
Open Mind Common Sense), games with a purpose (such as Verbosity and
nadya.jp), and expert-created resources (such as WordNet and JMDict).
You can browse what ConceptNet knows at http://conceptnet.io.
```
In this subset, I have extracted the `37` relation types that hold explicitly within the English language.
You can check out the [sample dataset](sample_dataset.csv) to get an idea about these relations, as well as visit the official [conceptnet wiki](https://github.com/commonsense/conceptnet5/wiki) for a comprehensive understanding.
There are `3,409,965` relationships in this dataset. I have parsed the [original assertions dataset](https://s3.amazonaws.com/conceptnet/downloads/2019/edges/conceptnet-assertions-5.7.0.csv.gz)
into the nine columns below.
```
- 'uri' : the complete ConceptNet URI for the relationship. e.g., /a/[/r/Antonym/,/c/en/able/,/c/en/cane/]
- 'rel' : the type of binary relationship. e.g., /r/Antonym
- 'start' : the first argument URI in the binary relationship. e.g., /c/en/able
- 'end' : the second argument URI in the binary relationship. e.g., /c/en/cane
- 'meta' : a string of JSON data holding the dataset name, license type (mostly cc-by-4.0), contributor, etc. e.g., {"dataset": "/d/verbosity", "license": "cc:by/4.0", "sources": [{"contributor": "/s/resource/verbosity"}], "surfaceEnd": "cane", "surfaceStart": "able", "surfaceText": "[[able]] is the opposite of [[cane]]", "weight": 0.299}
- 'dataset' : dataset info parsed from the `meta` column. e.g., /d/verbosity
- 'source' : contributor information, curation process, etc., parsed from the `meta` column. e.g., [{'contributor': '/s/resource/wiktionary/en', 'process': '/s/process/wikiparsec/2'}]
- 'concept1' : the first parsed concept. e.g., able
- 'concept2' : the second parsed concept. e.g., cane
```
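As a sketch of how these columns might be consumed, the row below is hand-built from this card's own examples rather than read from the actual dump; the key point is that `meta` is a JSON string and needs one `json.loads` call:

```python
import json

# A single hypothetical row shaped like the nine columns described above
# (values copied from the examples in this card, not from the real dump).
row = {
    'uri': '/a/[/r/Antonym/,/c/en/able/,/c/en/cane/]',
    'rel': '/r/Antonym',
    'start': '/c/en/able',
    'end': '/c/en/cane',
    'meta': '{"dataset": "/d/verbosity", "license": "cc:by/4.0", '
            '"surfaceText": "[[able]] is the opposite of [[cane]]", "weight": 0.299}',
}

# Parse the nested JSON, and recover the bare concepts from their URIs.
meta = json.loads(row['meta'])
concept1 = row['start'].split('/')[-1]
concept2 = row['end'].split('/')[-1]

print(meta['weight'], concept1, concept2)
```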
## Citation Information
Robyn Speer, Joshua Chin, and Catherine Havasi. 2017. "ConceptNet 5.5: An Open Multilingual Graph of General Knowledge." In Proceedings of AAAI 31. |
guiifive/fivevoz | ---
license: openrail
---
|