| datasetId | card |
|---|---|
CyberHarem/hoshiguma_yuugi_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hoshiguma_yuugi/星熊勇儀/호시구마유기 (Touhou)
This is the dataset of hoshiguma_yuugi/星熊勇儀/호시구마유기 (Touhou), containing 500 images and their tags.
The core tags of this character are `blonde_hair, horns, single_horn, long_hair, red_eyes, breasts, large_breasts, pointy_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 679.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshiguma_yuugi_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 382.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshiguma_yuugi_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1182 | 774.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshiguma_yuugi_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 604.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshiguma_yuugi_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1182 | 1.07 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshiguma_yuugi_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/hoshiguma_yuugi_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits of this character may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, sakazuki, solo, chain, cuffs, sake, smile, geta, sitting |
| 1 | 5 |  |  |  |  |  | 1girl, chain, shirt, skirt, solo, shackles, grin, looking_at_viewer, short_sleeves |
| 2 | 12 |  |  |  |  |  | 1girl, shackles, short_sleeves, solo, star_(symbol), white_shirt, sakazuki, blue_skirt, chain, geta, looking_at_viewer, full_body, holding_cup, simple_background, white_background, red_horns, see-through, clenched_hand, grin, navel, sake, striped_skirt |
| 3 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, simple_background, upper_body, white_background, white_shirt |
| 4 | 9 |  |  |  |  |  | 1girl, muscular_female, solo, nipples, nude, abs, huge_breasts, looking_at_viewer, navel, obliques, thick_thighs, grin, shackles |
| 5 | 22 |  |  |  |  |  | 1girl, futanari, huge_penis, nipples, testicles, abs, large_penis, solo, uncensored, muscular_female, looking_at_viewer, erection, huge_breasts, navel, very_long_hair, artist_name, blush, collarbone, completely_nude, oni, teeth, thick_thighs, veiny_penis, open_mouth, red_horns, simple_background, grin, steam, sweat, wet |
| 6 | 8 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, blush, paizuri, penis, huge_breasts, nipples, pov, looking_at_viewer, mosaic_censoring, smile, cuffs, cum_on_breasts, fellatio, nude, shirt_lift, sweat, uncensored |
| 7 | 5 |  |  |  |  |  | 1girl, cleavage, fake_animal_ears, playboy_bunny, rabbit_ears, solo, detached_collar, looking_at_viewer, rabbit_tail, alternate_costume, bare_shoulders, black_leotard, fake_tail, grin, ponytail, red_bowtie, armpits, arms_up, bangs, brown_pantyhose, chain, collarbone, covered_navel, red_horns, shackles, simple_background, sitting, strapless_leotard, very_long_hair, white_background |
| 8 | 6 |  |  |  |  |  | 1girl, blush, nipples, solo, nude, anus, cum_in_pussy, spread_legs, bar_censor, cumdrip, open_mouth, spread_pussy |
| 9 | 5 |  |  |  |  |  | 1girl, enmaided, looking_at_viewer, maid_apron, maid_headdress, solo, white_apron, frilled_apron, waist_apron, bangs, blue_dress, chain, frilled_dress, full_body, holding, mary_janes, puffy_short_sleeves, shackles, twin_braids, white_thighhighs, back_bow, blue_footwear, bowtie, cleavage, closed_mouth, neck_ribbon, red_horns, sakazuki, simple_background, sitting, star_(symbol), white_background, white_bow |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | sakazuki | solo | chain | cuffs | sake | smile | geta | sitting | shirt | skirt | shackles | grin | looking_at_viewer | short_sleeves | star_(symbol) | white_shirt | blue_skirt | full_body | holding_cup | simple_background | white_background | red_horns | see-through | clenched_hand | navel | striped_skirt | upper_body | muscular_female | nipples | nude | abs | huge_breasts | obliques | thick_thighs | futanari | huge_penis | testicles | large_penis | uncensored | erection | very_long_hair | artist_name | blush | collarbone | completely_nude | oni | teeth | veiny_penis | open_mouth | steam | sweat | wet | 1boy | hetero | solo_focus | paizuri | penis | pov | mosaic_censoring | cum_on_breasts | fellatio | shirt_lift | cleavage | fake_animal_ears | playboy_bunny | rabbit_ears | detached_collar | rabbit_tail | alternate_costume | bare_shoulders | black_leotard | fake_tail | ponytail | red_bowtie | armpits | arms_up | bangs | brown_pantyhose | covered_navel | strapless_leotard | anus | cum_in_pussy | spread_legs | bar_censor | cumdrip | spread_pussy | enmaided | maid_apron | maid_headdress | white_apron | frilled_apron | waist_apron | blue_dress | frilled_dress | holding | mary_janes | puffy_short_sleeves | twin_braids | white_thighhighs | back_bow | blue_footwear | bowtie | closed_mouth | neck_ribbon | white_bow |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:--------|:--------|:-------|:--------|:-------|:----------|:--------|:--------|:-----------|:-------|:--------------------|:----------------|:----------------|:--------------|:-------------|:------------|:--------------|:--------------------|:-------------------|:------------|:--------------|:----------------|:--------|:----------------|:-------------|:------------------|:----------|:-------|:------|:---------------|:-----------|:---------------|:-----------|:-------------|:------------|:--------------|:-------------|:-----------|:-----------------|:--------------|:--------|:-------------|:------------------|:------|:--------|:--------------|:-------------|:--------|:--------|:------|:-------|:---------|:-------------|:----------|:--------|:------|:-------------------|:-----------------|:-----------|:-------------|:-----------|:-------------------|:----------------|:--------------|:------------------|:--------------|:--------------------|:-----------------|:----------------|:------------|:-----------|:-------------|:----------|:----------|:--------|:------------------|:----------------|:--------------------|:-------|:---------------|:--------------|:-------------|:----------|:---------------|:-----------|:-------------|:-----------------|:--------------|:----------------|:--------------|:-------------|:----------------|:----------|:-------------|:----------------------|:--------------|:-------------------|:-----------|:----------------|:---------|:---------------|:--------------|:------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | X | X | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | | | | X | | | | | | | X | | | X | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | X | | | | | | | | | X | X | X | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 22 |  |  |  |  |  | X | | X | | | | | | | | | | X | X | | | | | | | X | | X | | | X | | | X | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | | | | X | | X | | | | | | | X | | | | | | | | | | | | | | | | X | X | | X | | | | | | | X | | | | X | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | X | X | | | | | X | | | X | X | X | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | X | X | | | | | X | | | X | | X | | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
diversifix/inclusive_words | ---
language: de
license: other
---
# Inclusive words in German 🏳️🌈 🇩🇪
Pairs of words and phrases in exclusive language and alternative words and phrases in inclusive language.
Inclusivity aims to cover all [dimensions of diversity](https://www.charta-der-vielfalt.de/en/understanding-diversity/diversity-dimensions/) (age, ethnic background and nationality, gender and gender identity, physical and mental abilities, religion and worldview, sexual orientation, social background, and more); however, this dataset currently focuses almost exclusively on **gender inclusion**, since gender exclusion is particularly dominant in the German language.
## Dataset structure
**Train/test split:** There is no train/test split, just a "train" dataset.
- **`exclusive`**: Exclusive words and phrases in the singular. For the dimension of gender, these are certain words and phrases in the grammatical masculine. Note that the grammatical masculine is only exclusive if it is used in a _generic_ sense: "Die Doktoren" may be accurately used to describe three male doctors, but the same phrase is exclusive when it intends to refer to a group that also (potentially) includes women and nonbinary people. The relation between exclusive and inclusive phrases is n-to-n: An exclusive phrase may occur in multiple rows with various inclusive phrases associated, and vice versa.
- **`inclusive`**: Corresponding inclusive word or phrase that can replace the exclusive phrase. It may be applicable only in a certain context and not in others. Usually in the singular; where `number` is plural, it may be either in the singular or plural. The relation between exclusive and inclusive phrases is n-to-n: An inclusive phrase may occur in multiple rows with various exclusive phrases associated, and vice versa.
- **`applicable`**: One of `in_singular`, `in_plural`, or `always`. Specifies the grammatical number that the _exclusive_ phrase must be found in for it to be replaceable by the inclusive phrase given in this entry.
  - _Special case:_ Some singular words (such as "Management" as a replacement for "Manager") occur in two rows, once with the attribute `always`, once with the attribute `in_plural`. The first means that "Manager" (singular) can be replaced with "Management" (singular) and "Manager" (plural) can be replaced with "Managements" (plural); the second means that "Manager" (plural) can (also) be replaced with "Management" (singular).
- **`gender_of_inclusive`**: Whether the inclusive phrase is semantically `neutral` or `female`. If it is female, it is not by itself inclusive but has to be combined with the male phrase (and potentially a character such as the gender star for representing nonbinary persons) to form a neutral phrase. (Since the male phrase is already given by the `exclusive` column, it is not repeated in the `inclusive` column due to potentially questionable ideological beliefs about data normalization.)
- **`source`**: The origin of the entry.
- _geschicktgendern_: The entry has been copied from the _Genderwörterbuch_ by _Geschickt Gendern_. These entries are under a CC-BY-NC-SA 4.0 International License (c) Johanna Usinger, [geschicktgendern.de](https://geschicktgendern.de/).
- _dereko_: The entry has been extracted from the German reference corpus [DeReKo](https://www.ids-mannheim.de/en/digspra/corpus-linguistics/projects/corpus-development/). Since these are single words only, copyright does not apply and the entries are under the CC-0 license.
- _diversifix_: Entries added by ourselves or our community, also under the CC-0 license.
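As a minimal sketch of how the columns above interact, the lookup below applies the `applicable` rule to a few hypothetical rows that mimic the described schema (the sample rows and the helper function are illustrative assumptions, not actual dataset entries):

```python
# Sketch of using the schema described above.
# The sample rows below are illustrative, not real dataset entries.
rows = [
    {"exclusive": "Manager", "inclusive": "Management",
     "applicable": "always", "gender_of_inclusive": "neutral"},
    {"exclusive": "Manager", "inclusive": "Führungskraft",
     "applicable": "always", "gender_of_inclusive": "neutral"},
]

def suggestions(word: str, number: str) -> list[str]:
    """Return inclusive alternatives for `word` used in the given grammatical number."""
    wanted = {"always", "in_singular" if number == "singular" else "in_plural"}
    return [r["inclusive"] for r in rows
            if r["exclusive"] == word and r["applicable"] in wanted]

print(suggestions("Manager", "singular"))  # ['Management', 'Führungskraft']
```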
## Bias
The entries from the `dereko` source have been extracted according to their frequency in the corpus. This means, for example, that there are words referring to people from larger countries but not from some smaller countries; or, more accurately, countries that are considered important from the perspective of German-speaking journalism are more prevalent in the dataset.
## License
Mixed license. All data is open, but part of it only for noncommercial use. See the description of the `source` column above for details.
## See also
- [Other data sources on inclusive German.](https://github.com/tech4germany/bam-inclusify/blob/main/doc/data.md)
- [retext-equality](https://github.com/retextjs/retext-equality) 🏳️🌈 🇬🇧
|
am-infoweb/rap_phase2_26march_custom | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 34458336.0
num_examples: 31740
- name: test
num_bytes: 11486112.0
num_examples: 10580
download_size: 23444945
dataset_size: 45944448.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ceadar-ie/AIVision360-8k | ---
license: apache-2.0
task_categories:
- question-answering
- conversational
- text-generation
language:
- en
tags:
- LLM
- Generative AI
- Finetune
- Domain Specific Data
size_categories:
- 1K<n<10K
---
# Dataset Card for AIVision360-8k
## Dataset Description
AIVision360 is a domain-specific dataset for media and journalism, designed expressly for the instruction fine-tuning of Large Language Models (LLMs).\
The AIVision360-8k dataset is a curated collection sourced from "ainewshub.ie", a platform dedicated to Artificial Intelligence news from quality-controlled publishers. It is designed to provide a comprehensive representation of AI-related discussions, highlighting current developments and trends in the field. Each entry in the dataset contains three columns: "question", "response", and "context". These columns offer a structured view of AI news interactions, where the "question" and "response" provide insights on AI subjects, and the "context" column gives additional background information.
### Key Features
• Domain Specificity: The dataset is focused on AI news, catering to researchers, developers, and specialists in the domain.\
• Source Reliability: Data is sourced from established publishers featured on "ainewshub.ie", ensuring content reliability.\
• Licensing: It is distributed under the Apache 2.0 open-source license, facilitating its use and modification.\
• Accessibility: Intended for public use to support collaboration and analysis in the AI community.\
• Volume: Contains over 8,000 entries, making it a significant resource for AI news analysis.
### Intended Use Cases
• Model Training: Suitable for training language models, enhancing their capacity in AI news discussions.\
• Research: Useful for AI trend analysis, sentiment analysis, and linguistic pattern study.
### Limitations
• Despite careful curation, potential biases from AI news sources may persist in the dataset.\
• Its focus is on AI news, which may reflect specific perspectives of this niche.
## Language
English
### Data Privacy
The dataset comprises publicly available news articles and does not include private identifiers or sensitive information.
### License/Attribution
Copyright © 2023 CeADAR Connect Group. Developed by CeADAR (ceadar.ie), its use is governed by the Apache 2.0 license.
### Sources
Curated exclusively from ainewshub.ie, a recognized platform for AI news.
## Annotator Guidelines
• Question: Represents a query derived from the news article.\
• Response: Provides an answer based on the article's content.\
• Context: Offers background information for the query-answer pair.
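A minimal sketch of turning one such (question, response, context) row into an instruction-tuning example follows; the sample row and the prompt template are illustrative assumptions, not the dataset's prescribed format:

```python
# Sketch: format a (question, response, context) row for instruction fine-tuning.
# The example row and the prompt template are illustrative assumptions.
def to_training_example(row: dict) -> dict:
    prompt = (
        f"Context: {row['context']}\n"
        f"Question: {row['question']}\n"
        "Answer:"
    )
    return {"prompt": prompt, "completion": " " + row["response"]}

row = {
    "question": "What does the article say about open-source LLMs?",
    "response": "It reports growing adoption of open-source LLMs in industry.",
    "context": "A news article on recent trends in generative AI.",
}
example = to_training_example(row)
print(example["prompt"])
```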
### Feedback
For any questions or feedback related to the dataset, please direct your communications to ahtsham.zafar@ucd.ie
### Disclaimer
This dataset is provided "as is" without any guarantees or warranty. Although the data has been processed with care, CeADAR Connect Group is not responsible for any errors, omissions, or discrepancies within the data. Users are advised to use this dataset at their discretion and assume any risks associated with its use. |
Jalinvel3/Geneautry | ---
license: artistic-2.0
---
|
adityarana021/DEEPFRUlT-DATASET | ---
task_categories:
- text-classification
language:
- en
pretty_name: 'n'
--- |
kenjiqq/imagereward-evaluation | ---
license: cc0-1.0
---
|
sankettgorey/three_layouts | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 449387243.1132901
num_examples: 1442
- name: test
num_bytes: 55106176.92124237
num_examples: 181
- name: validation
num_bytes: 55521421.31946755
num_examples: 180
download_size: 469923853
dataset_size: 560014841.354
---
# Dataset Card for "three_layouts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/OxfordPets_facebook_opt_350m_LLM_Description_gpt3_downstream_tasks_ViT_L_14 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: text
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: test
num_bytes: 119984114.375
num_examples: 3669
download_size: 119029045
dataset_size: 119984114.375
---
# Dataset Card for "OxfordPets_facebook_opt_350m_LLM_Description_gpt3_downstream_tasks_ViT_L_14"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-36bd0b51-8375120 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- scientific_papers
eval_info:
task: summarization
model: google/bigbird-pegasus-large-pubmed
metrics: ['bertscore', 'meteor']
dataset_name: scientific_papers
dataset_config: pubmed
dataset_split: test
col_mapping:
text: article
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/bigbird-pegasus-large-pubmed
* Dataset: scientific_papers
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Blaise_g](https://huggingface.co/Blaise_g) for evaluating this model. |
wshi83/EHRAgent-treqs | ---
license: apache-2.0
---
|
Ajitava/go_emotions_multi_label | ---
license: mit
---
This is a dataset for multi-label emotion classification based on the GoEmotions label set.
This dataset was labeled by a team of 12 engineers (custom-marked labels).
This dataset also shows the evaluation results for 3 models, viz. RoBERTa, BERT Cased, and BERT Uncased, on this dataset. |
autoevaluate/autoeval-eval-lener_br-lener_br-851daf-1777161683 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- lener_br
eval_info:
task: entity_extraction
model: pierreguillou/ner-bert-large-cased-pt-lenerbr
metrics: []
dataset_name: lener_br
dataset_config: lener_br
dataset_split: train
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: pierreguillou/ner-bert-large-cased-pt-lenerbr
* Dataset: lener_br
* Config: lener_br
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Luciano](https://huggingface.co/Luciano) for evaluating this model. |
ZhaoweiWang/SubeventWriter | ---
license: mit
---
|
sankovic/shirimdataset | ---
license: openrail
---
|
habixia1/0k | ---
license: afl-3.0
---
|
liuyanchen1015/MULTI_VALUE_stsb_our_we | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2853
num_examples: 14
- name: train
num_bytes: 76
num_examples: 1
download_size: 0
dataset_size: 2929
---
# Dataset Card for "MULTI_VALUE_stsb_our_we"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FaalSa/dataD | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 57629
num_examples: 1
- name: validation
num_bytes: 58109
num_examples: 1
- name: test
num_bytes: 58589
num_examples: 1
download_size: 35395
dataset_size: 174327
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Nicollas563/Uijjj | ---
license: openrail
---
|
kpriyanshu256/MultiTabQA-tapex-Salesforce-codet5-base-markdown | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 36766528167
num_examples: 1650977
- name: validation
num_bytes: 4087830371
num_examples: 183442
download_size: 7681286879
dataset_size: 40854358538
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
RIW/small_coco_test_1_1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
- name: url
dtype: string
- name: key
dtype: string
- name: status
dtype: string
- name: error_message
dtype: 'null'
- name: width
dtype: int64
- name: height
dtype: int64
- name: original_width
dtype: int64
- name: original_height
dtype: int64
- name: exif
dtype: string
- name: sha256
dtype: string
- name: watermark
dtype: bool
splits:
- name: train
num_bytes: 816214224.2
num_examples: 9950
- name: validation
num_bytes: 885003521.915
num_examples: 8965
download_size: 362870789
dataset_size: 1701217746.115
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
t2chiang/test | ---
dataset_info:
features:
- name: xyz
sequence:
sequence: float64
- name: label
sequence:
sequence: bool
splits:
- name: resamplingTest
num_bytes: 484724304
num_examples: 458
download_size: 363884098
dataset_size: 484724304
configs:
- config_name: default
data_files:
- split: resamplingTest
path: data/resamplingTest-*
---
|
HydraLM/partitioned_v3_standardized_029 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_id
dtype: string
splits:
- name: train
num_bytes: 9747733.340565363
num_examples: 18128
download_size: 9524643
dataset_size: 9747733.340565363
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v3_standardized_029"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jamestalentium/cnn_dailymail_10_rm | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 43944.50216465294
num_examples: 10
download_size: 22784
dataset_size: 43944.50216465294
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cnn_dailymail_10_rm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/gpt-roleplay-realm-chatml | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
splits:
- name: train
num_bytes: 9428391
num_examples: 4536
download_size: 3208011
dataset_size: 9428391
---
# Dataset Card for "gpt-roleplay-realm-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liveqa | ---
annotations_creators:
- found
language_creators:
- found
language:
- zh
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: liveqa
pretty_name: LiveQA
dataset_info:
features:
- name: id
dtype: int64
- name: passages
sequence:
- name: is_question
dtype: bool
- name: text
dtype: string
- name: candidate1
dtype: string
- name: candidate2
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 112187507
num_examples: 1670
download_size: 114704569
dataset_size: 112187507
---
# Dataset Card for LiveQA
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/PKU-TANGENT/LiveQA)
- **Repository:** [Github](https://github.com/PKU-TANGENT/LiveQA)
- **Paper:** [Liu et al., 2020](https://www.aclweb.org/anthology/2020.ccl-1.98.pdf)
- **Leaderboard:** N/A
- **Point of Contact:** Qianying Liu
### Dataset Summary
The LiveQA dataset is a Chinese question-answering resource constructed from play-by-play live broadcasts. It contains 117k multiple-choice questions written by human commentators for over 1,670 NBA games, all collected from the Chinese Hupu website.
### Supported Tasks and Leaderboards
Question Answering.
[More Information Needed]
### Languages
Chinese.
## Dataset Structure
### Data Instances
Each instance represents a timeline (i.e., a game) with an identifier. The `passages` field comprises an array of text or question segments. In the following truncated example, user comments about the game are followed by a question about which team will be the first to reach 60 points.
```python
{
'id': 1,
'passages': [
{
"is_question": False,
"text": "'我希望两位球员都能做到!!",
"candidate1": "",
"candidate2": "",
"answer": "",
},
{
"is_question": False,
"text": "新年给我们送上精彩比赛!",
"candidate1": "",
"candidate2": "",
"answer": "",
},
{
"is_question": True,
"text": "先达到60分?",
"candidate1": "火箭",
"candidate2": "勇士",
"answer": "勇士",
},
{
"is_question": False,
"text": "自己急停跳投!!!",
"candidate1": "",
"candidate2": "",
"answer": "",
}
]
}
```
### Data Fields
- id: identifier for the game
- passages: collection of text/question segments
- text: real-time text comment or binary question related to the context
- candidate1/2: one of the two answer options to the question
- answer: correct answer to the question in text
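Given this structure, the multiple-choice questions can be pulled out of an instance by filtering on `is_question`; the sketch below reuses segments from the truncated example above (the instance literal is hand-built here, not loaded from the dataset):

```python
# Sketch: extract the multiple-choice questions from a LiveQA-style instance.
# The instance literal is hand-built from the truncated example above.
instance = {
    "id": 1,
    "passages": [
        {"is_question": False, "text": "新年给我们送上精彩比赛!",
         "candidate1": "", "candidate2": "", "answer": ""},
        {"is_question": True, "text": "先达到60分?",
         "candidate1": "火箭", "candidate2": "勇士", "answer": "勇士"},
    ],
}

questions = [
    (p["text"], [p["candidate1"], p["candidate2"]], p["answer"])
    for p in instance["passages"]
    if p["is_question"]
]
print(questions)  # [('先达到60分?', ['火箭', '勇士'], '勇士')]
```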
### Data Splits
There is no predefined split in this dataset.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
This resource is developed by [Liu et al., 2020](https://www.aclweb.org/anthology/2020.ccl-1.98.pdf).
```
@inproceedings{qianying-etal-2020-liveqa,
title = "{L}ive{QA}: A Question Answering Dataset over Sports Live",
author = "Qianying, Liu and
Sicong, Jiang and
Yizhong, Wang and
Sujian, Li",
booktitle = "Proceedings of the 19th Chinese National Conference on Computational Linguistics",
month = oct,
year = "2020",
address = "Haikou, China",
publisher = "Chinese Information Processing Society of China",
url = "https://www.aclweb.org/anthology/2020.ccl-1.98",
pages = "1057--1067"
}
```
### Contributions
Thanks to [@j-chim](https://github.com/j-chim) for adding this dataset.
---
pretty_name: Evaluation run of frankenmerger/delta-4B-scientific
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [frankenmerger/delta-4B-scientific](https://huggingface.co/frankenmerger/delta-4B-scientific)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_frankenmerger__delta-4B-scientific\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T05:03:53.088812](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__delta-4B-scientific/blob/main/results_2024-03-11T05-03-53.088812.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5763611715340329,\n\
\ \"acc_stderr\": 0.03364415142924134,\n \"acc_norm\": 0.5787195840594358,\n\
\ \"acc_norm_stderr\": 0.034335728498316835,\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241473,\n \"mc2\": 0.48388057912772253,\n\
\ \"mc2_stderr\": 0.015377864755358938\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558902,\n\
\ \"acc_norm\": 0.5938566552901023,\n \"acc_norm_stderr\": 0.014351656690097862\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5517825134435371,\n\
\ \"acc_stderr\": 0.004962949784236048,\n \"acc_norm\": 0.7409878510256921,\n\
\ \"acc_norm_stderr\": 0.0043719695428145605\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183235,\n\
\ \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183235\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887249,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887249\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873634,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873634\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4417989417989418,\n \"acc_stderr\": 0.025576257061253833,\n \"\
acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.025576257061253833\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n\
\ \"acc_stderr\": 0.02556060472102289,\n \"acc_norm\": 0.7193548387096774,\n\
\ \"acc_norm_stderr\": 0.02556060472102289\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878948,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878948\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7614678899082569,\n \"acc_stderr\": 0.01827257581023187,\n \"\
acc_norm\": 0.7614678899082569,\n \"acc_norm_stderr\": 0.01827257581023187\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\"\
: 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \"\
acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009154,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009154\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.685823754789272,\n\
\ \"acc_stderr\": 0.016599291735884904,\n \"acc_norm\": 0.685823754789272,\n\
\ \"acc_norm_stderr\": 0.016599291735884904\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165538,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165538\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n\
\ \"acc_stderr\": 0.01538284558758452,\n \"acc_norm\": 0.3039106145251397,\n\
\ \"acc_norm_stderr\": 0.01538284558758452\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302895,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302895\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.02731684767419271,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.02731684767419271\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40352020860495436,\n\
\ \"acc_stderr\": 0.012530241301193176,\n \"acc_norm\": 0.40352020860495436,\n\
\ \"acc_norm_stderr\": 0.012530241301193176\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5343137254901961,\n \"acc_stderr\": 0.02018014484330729,\n \
\ \"acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.02018014484330729\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.030021056238440303,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.030021056238440303\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.02796267760476892,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.02796267760476892\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036847,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241473,\n \"mc2\": 0.48388057912772253,\n\
\ \"mc2_stderr\": 0.015377864755358938\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224178\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.47081122062168307,\n \
\ \"acc_stderr\": 0.013748996794921794\n }\n}\n```"
repo_url: https://huggingface.co/frankenmerger/delta-4B-scientific
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|arc:challenge|25_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|gsm8k|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hellaswag|10_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T05-03-53.088812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T05-03-53.088812.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- '**/details_harness|winogrande|5_2024-03-11T05-03-53.088812.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T05-03-53.088812.parquet'
- config_name: results
data_files:
- split: 2024_03_11T05_03_53.088812
path:
- results_2024-03-11T05-03-53.088812.parquet
- split: latest
path:
- results_2024-03-11T05-03-53.088812.parquet
---
# Dataset Card for Evaluation run of frankenmerger/delta-4B-scientific
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [frankenmerger/delta-4B-scientific](https://huggingface.co/frankenmerger/delta-4B-scientific) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_frankenmerger__delta-4B-scientific",
"harness_winogrande_5",
         split="latest")
```
## Latest results
These are the [latest results from run 2024-03-11T05:03:53.088812](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__delta-4B-scientific/blob/main/results_2024-03-11T05-03-53.088812.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5763611715340329,
"acc_stderr": 0.03364415142924134,
"acc_norm": 0.5787195840594358,
"acc_norm_stderr": 0.034335728498316835,
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241473,
"mc2": 0.48388057912772253,
"mc2_stderr": 0.015377864755358938
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558902,
"acc_norm": 0.5938566552901023,
"acc_norm_stderr": 0.014351656690097862
},
"harness|hellaswag|10": {
"acc": 0.5517825134435371,
"acc_stderr": 0.004962949784236048,
"acc_norm": 0.7409878510256921,
"acc_norm_stderr": 0.0043719695428145605
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183235,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.04032999053960718,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.04032999053960718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887249,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887249
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873634,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873634
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.025576257061253833,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.025576257061253833
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.02556060472102289,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.02556060472102289
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878948,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878948
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7614678899082569,
"acc_stderr": 0.01827257581023187,
"acc_norm": 0.7614678899082569,
"acc_norm_stderr": 0.01827257581023187
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009154,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009154
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.685823754789272,
"acc_stderr": 0.016599291735884904,
"acc_norm": 0.685823754789272,
"acc_norm_stderr": 0.016599291735884904
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165538,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165538
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3039106145251397,
"acc_stderr": 0.01538284558758452,
"acc_norm": 0.3039106145251397,
"acc_norm_stderr": 0.01538284558758452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302895,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302895
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.02731684767419271,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.02731684767419271
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40352020860495436,
"acc_stderr": 0.012530241301193176,
"acc_norm": 0.40352020860495436,
"acc_norm_stderr": 0.012530241301193176
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.02018014484330729,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.02018014484330729
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.030021056238440303,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.030021056238440303
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.02796267760476892,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.02796267760476892
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241473,
"mc2": 0.48388057912772253,
"mc2_stderr": 0.015377864755358938
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224178
},
"harness|gsm8k|5": {
"acc": 0.47081122062168307,
"acc_stderr": 0.013748996794921794
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sayan1101/reward_test_custom_dataset_RLHF | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid1
path: data/valid1-*
- split: valid2
path: data/valid2-*
dataset_info:
features:
- name: chosen
dtype: string
- name: prompt
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 27648
num_examples: 41
- name: test
num_bytes: 27648
num_examples: 41
- name: valid1
num_bytes: 27648
num_examples: 41
- name: valid2
num_bytes: 27648
num_examples: 41
download_size: 101852
dataset_size: 110592
---
# Dataset Card for "reward_test_custom_dataset_RLHF"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mike-ravkine/rosettacode-parsed | ---
license: gfdl
task_categories:
- text-generation
language:
- en
- code
---
## Data Origins
Original dataset: https://huggingface.co/datasets/jondurbin/rosettacode-raw/
Cleaner code: https://github.com/the-crypt-keeper/rosettacode-parser
## Data Fields
|Field|Type|Description|
|---|---|---|
|title|string|problem title|
|task|string|problem description|
|language|string|solution language/variant|
|solution|string|solution source code|
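Each record is one JSON object per line. A minimal reading sketch, using illustrative records with the field names from the table above (the dataset files may spell the source-code field differently):

```python
import io
import json

# Minimal sketch: each per-language file is JSON Lines, one record per solution.
# The records below are illustrative, not taken from the dataset.
jsonl_text = "\n".join(
    json.dumps(rec)
    for rec in [
        {"title": "FizzBuzz", "task": "Print 1..100 ...", "language": "Python", "solution": "print(1)"},
        {"title": "FizzBuzz", "task": "Print 1..100 ...", "language": "Python (from C)", "solution": "print(2)"},
    ]
)

records = [json.loads(line) for line in io.StringIO(jsonl_text)]
print(len(records))  # 2
```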
## Languages
One `.jsonl` file is provided per language group; the `sublanguage` field in the data denotes the specific language version/variant, or the source language the example was ported from.
```
Language Python problems 510 rows 621
Language C problems 350 rows 350
Language C++ problems 403 rows 416
Language C sharp problems 322 rows 342
Language Go problems 496 rows 503
Language JavaScript problems 269 rows 301
Language Java problems 470 rows 512
Language Lua problems 335 rows 339
Language Kotlin problems 435 rows 435
Language Ruby problems 418 rows 444
Total 4894 done 565 skip 4329 failed 0 rows 4263
``` |
alperiox/cctv_pistols | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1167185.0
num_examples: 20
download_size: 520754
dataset_size: 1167185.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yyu/arxiv-attrprompt | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
tags:
- multilabel_classification
- arxiv
- scientific_papers
size_categories:
- 10K<n<100K
version:
- V1
---
This is the data used in the paper [Large Language Model as Attributed Training Data Generator: A Tale of Diversity and Bias](https://github.com/yueyu1030/AttrPrompt).
See the paper: https://arxiv.org/abs/2306.15895 for details.
- `label.txt`: the label name for each class
- `train.jsonl`: The original training set.
- `valid.jsonl`: The original validation set.
- `test.jsonl`: The original test set.
- `simprompt.jsonl`: The training data generated by the simple prompt.
- `attrprompt.jsonl`: The training data generated by the attributed prompt.
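Each of the files above is plain JSON Lines. A minimal sketch of reading one example and checking the multi-label `labels` field (the record and the `text` key are illustrative assumptions):

```python
import json

# Minimal sketch: one JSON object per line; in this dataset `labels`
# is a list of class indices (multi-label), not a single integer.
# The example record and the `text` key are assumptions for illustration.
line = json.dumps({"text": "A paper on graph neural networks.", "labels": [3, 17]})

example = json.loads(line)
assert isinstance(example["labels"], list)  # a list, not an int
print(example["labels"])
```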
**Note**: Unlike the other datasets, the `labels` for the training/validation/test data are each a *list* rather than an integer, as this is a multi-label classification dataset. |
tykimos/company_rules | ---
license: afl-3.0
---
|
alexshengzhili/SciCapInstructed410K | ---
license: mit
dataset_info:
features:
- name: image_file
dtype: string
- name: id
dtype: string
- name: caption
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: first_mention
dtype: string
- name: response
dtype: string
splits:
- name: validation
num_bytes: 246101
num_examples: 93
- name: train
num_bytes: 991847836
num_examples: 352018
download_size: 524856499
dataset_size: 992093937
---
|
sukantan/nyaya-st-training | ---
dataset_info:
features:
- name: test_id
dtype: string
- name: act
dtype: string
- name: section_no
dtype: string
- name: case_matter
dtype: string
- name: section_part
dtype: string
splits:
- name: train
num_bytes: 17923796
num_examples: 6252
download_size: 375286
dataset_size: 17923796
---
# Dataset Card for "nyaya-st-training"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
allenai/qasc | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- question-answering
- multiple-choice
task_ids:
- extractive-qa
- multiple-choice-qa
paperswithcode_id: qasc
pretty_name: Question Answering via Sentence Composition (QASC)
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
- name: fact1
dtype: string
- name: fact2
dtype: string
- name: combinedfact
dtype: string
- name: formatted_question
dtype: string
splits:
- name: train
num_bytes: 4891878
num_examples: 8134
- name: test
num_bytes: 390534
num_examples: 920
- name: validation
num_bytes: 559180
num_examples: 926
download_size: 2349698
dataset_size: 5841592
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
# Dataset Card for "qasc"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://allenai.org/data/qasc](https://allenai.org/data/qasc)
- **Repository:** https://github.com/allenai/qasc/
- **Paper:** [QASC: A Dataset for Question Answering via Sentence Composition](https://arxiv.org/abs/1910.11473)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 1.61 MB
- **Size of the generated dataset:** 5.87 MB
- **Total amount of disk used:** 7.49 MB
### Dataset Summary
QASC is a question-answering dataset with a focus on sentence composition. It consists of 9,980 8-way multiple-choice
questions about grade school science (8,134 train, 926 dev, 920 test), and comes with a corpus of 17M sentences.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 1.61 MB
- **Size of the generated dataset:** 5.87 MB
- **Total amount of disk used:** 7.49 MB
An example of 'validation' looks as follows.
```
{
"answerKey": "F",
"choices": {
"label": ["A", "B", "C", "D", "E", "F", "G", "H"],
"text": ["sand", "occurs over a wide range", "forests", "Global warming", "rapid changes occur", "local weather conditions", "measure of motion", "city life"]
},
"combinedfact": "Climate is generally described in terms of local weather conditions",
"fact1": "Climate is generally described in terms of temperature and moisture.",
"fact2": "Fire behavior is driven by local weather conditions such as winds, temperature and moisture.",
"formatted_question": "Climate is generally described in terms of what? (A) sand (B) occurs over a wide range (C) forests (D) Global warming (E) rapid changes occur (F) local weather conditions (G) measure of motion (H) city life",
"id": "3NGI5ARFTT4HNGVWXAMLNBMFA0U1PG",
"question": "Climate is generally described in terms of what?"
}
```
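As the instance above shows, `formatted_question` is simply the question followed by the lettered choices. A sketch of that composition (illustrative only, not the official generation code):

```python
def format_question(question: str, labels: list[str], texts: list[str]) -> str:
    """Join a question with its '(A) choice' pairs, matching the example above."""
    choices = " ".join(f"({label}) {text}" for label, text in zip(labels, texts))
    return f"{question} {choices}"

labels = ["A", "B", "C"]
texts = ["sand", "forests", "local weather conditions"]
print(format_question("Climate is generally described in terms of what?", labels, texts))
# Climate is generally described in terms of what? (A) sand (B) forests (C) local weather conditions
```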
### Data Fields
The data fields are the same among all splits.
#### default
- `id`: a `string` feature.
- `question`: a `string` feature.
- `choices`: a dictionary feature containing:
- `text`: a `string` feature.
- `label`: a `string` feature.
- `answerKey`: a `string` feature.
- `fact1`: a `string` feature.
- `fact2`: a `string` feature.
- `combinedfact`: a `string` feature.
- `formatted_question`: a `string` feature.
### Data Splits
| name |train|validation|test|
|-------|----:|---------:|---:|
|default| 8134| 926| 920|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
The dataset is released under [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) license.
### Citation Information
```
@article{allenai:qasc,
author = {Tushar Khot and Peter Clark and Michal Guerquin and Peter Jansen and Ashish Sabharwal},
title = {QASC: A Dataset for Question Answering via Sentence Composition},
journal = {arXiv:1910.11473v2},
year = {2020},
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@lewtun](https://github.com/lewtun) for adding this dataset. |
tqfang229/COM2-commonsense | ---
license: mit
---
|
FanChen0116/bus_few4_40x_empty | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 485547
num_examples: 2800
- name: validation
num_bytes: 6128
num_examples: 35
- name: test
num_bytes: 70618
num_examples: 377
download_size: 0
dataset_size: 562293
---
# Dataset Card for "bus_few4_40x_empty"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alienmaster/omp_sa | ---
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
language:
- de
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
tags:
- Sentiment Analysis
task_categories:
- text-classification
pretty_name: One Million Posts Corpus - Sentiment Subset
configs:
- config_name: default
column_names: ["ID_Post","Headline","Body","Category"]
data_files:
- split: "full"
path: "full.csv"
---
# Dataset Card for One Million Posts Corpus - Sentiment Subset
## Dataset Description
- **Homepage:** https://ofai.github.io/million-post-corpus/
- **Repository:** https://github.com/OFAI/million-post-corpus
- **Paper:** https://dl.acm.org/doi/10.1145/3077136.3080711
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The “One Million Posts” corpus is an annotated data set consisting of user comments posted to an Austrian newspaper website (in German language).
This subset of the original dataset contains only the post IDs, headlines, and bodies of posts that carry a sentiment label.
The sentiment labels are renamed to "Positive", "Negative", and "Neutral" for convenience.
If you are interested in the full dataset, use the official [dataset](https://huggingface.co/datasets/omp) on Hugging Face.
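The single `full` split is a CSV with the columns declared in the card header (`ID_Post`, `Headline`, `Body`, `Category`). A minimal parsing sketch with illustrative rows (the real rows ship in `full.csv`):

```python
import csv
import io

# Minimal sketch: parse rows with the columns declared in the card header.
# The two rows below are illustrative, not taken from the corpus.
sample_csv = (
    "ID_Post,Headline,Body,Category\n"
    '1,"Guter Artikel","Danke für den Beitrag.",Positive\n'
    '2,"","Das sehe ich anders.",Negative\n'
)

rows = list(csv.DictReader(io.StringIO(sample_csv)))
print(rows[0]["Category"])  # Positive
```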
### Licensing Information
This data set is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
### Citation Information
```
@InProceedings{Schabus2018,
author = {Dietmar Schabus and Marcin Skowron},
title = {Academic-Industrial Perspective on the Development and Deployment of a Moderation System for a Newspaper Website},
booktitle = {Proceedings of the 11th International Conference on Language Resources and Evaluation (LREC)},
year = {2018},
address = {Miyazaki, Japan},
month = may,
pages = {1602-1605},
abstract = {This paper describes an approach and our experiences from the development, deployment and usability testing of a Natural Language Processing (NLP) and Information Retrieval system that supports the moderation of user comments on a large newspaper website. We highlight some of the differences between industry-oriented and academic research settings and their influence on the decisions made in the data collection and annotation processes, selection of document representation and machine learning methods. We report on classification results, where the problems to solve and the data to work with come from a commercial enterprise. In this context typical for NLP research, we discuss relevant industrial aspects. We believe that the challenges faced as well as the solutions proposed for addressing them can provide insights to others working in a similar setting.},
url = {http://www.lrec-conf.org/proceedings/lrec2018/summaries/8885.html},
}
```
|
C-MTEB/OnlineShopping-classification | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: cat
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 1535074.0115334373
num_examples: 8000
- name: test
num_bytes: 191884.25144167966
num_examples: 1000
download_size: 1139002
dataset_size: 1726958.262975117
---
# Dataset Card for "OnlineShopping-classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp | ---
pretty_name: Evaluation run of Samee-ur/NeuralPipe-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Samee-ur/NeuralPipe-7B-slerp](https://huggingface.co/Samee-ur/NeuralPipe-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T03:25:19.988005](https://huggingface.co/datasets/open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp/blob/main/results_2024-02-02T03-25-19.988005.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6444688446653744,\n\
\ \"acc_stderr\": 0.03217564834975917,\n \"acc_norm\": 0.6448609553287138,\n\
\ \"acc_norm_stderr\": 0.032833467276313325,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5985018412437423,\n\
\ \"mc2_stderr\": 0.01514980059720055\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.013960142600598675,\n\
\ \"acc_norm\": 0.6774744027303754,\n \"acc_norm_stderr\": 0.013659980894277364\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6700856403106951,\n\
\ \"acc_stderr\": 0.004692208279690595,\n \"acc_norm\": 0.8616809400517825,\n\
\ \"acc_norm_stderr\": 0.0034452899250117337\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n\
\ \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n\
\ \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n\
\ \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n\
\ \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n \
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"\
acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.0257449025322909,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.0257449025322909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323793,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323793\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n\
\ \"acc_stderr\": 0.016083749986853697,\n \"acc_norm\": 0.36312849162011174,\n\
\ \"acc_norm_stderr\": 0.016083749986853697\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015058,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015058\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5985018412437423,\n\
\ \"mc2_stderr\": 0.01514980059720055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.01120186274448705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6853677028051555,\n \
\ \"acc_stderr\": 0.01279103722733604\n }\n}\n```"
repo_url: https://huggingface.co/Samee-ur/NeuralPipe-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|arc:challenge|25_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|gsm8k|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hellaswag|10_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-25-19.988005.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T03-25-19.988005.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- '**/details_harness|winogrande|5_2024-02-02T03-25-19.988005.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T03-25-19.988005.parquet'
- config_name: results
data_files:
- split: 2024_02_02T03_25_19.988005
path:
- results_2024-02-02T03-25-19.988005.parquet
- split: latest
path:
- results_2024-02-02T03-25-19.988005.parquet
---
# Dataset Card for Evaluation run of Samee-ur/NeuralPipe-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Samee-ur/NeuralPipe-7B-slerp](https://huggingface.co/Samee-ur/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp",
"harness_winogrande_5",
	split="latest")
```
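Once a split is loaded, the per-task metrics come back as `value`/`value_stderr` pairs, as in the "Latest results" section below. As a minimal, self-contained sketch (the `all_metrics` dict simply copies the aggregated "all" values from this card, and `format_metric` is a hypothetical helper, not part of the `datasets` API), you might render a metric with its standard error like this:

```python
# Aggregated "all" metrics, copied from this card's latest results.
all_metrics = {
    "acc": 0.6444688446653744,
    "acc_stderr": 0.03217564834975917,
    "acc_norm": 0.6448609553287138,
    "acc_norm_stderr": 0.032833467276313325,
    "mc1": 0.4283965728274174,
    "mc1_stderr": 0.017323088597314754,
    "mc2": 0.5985018412437423,
    "mc2_stderr": 0.01514980059720055,
}

def format_metric(metrics, name):
    """Render a metric and its standard error as percentages: 'value ± stderr'."""
    value = metrics[name]
    stderr = metrics[f"{name}_stderr"]
    return f"{value * 100:.2f} ± {stderr * 100:.2f}"

print(format_metric(all_metrics, "acc_norm"))  # -> 64.49 ± 3.28
```

The same pattern applies to any of the per-task entries (e.g. `harness|hellaswag|10`), since they all share the `acc`/`acc_stderr`/`acc_norm`/`acc_norm_stderr` layout.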
## Latest results
These are the [latest results from run 2024-02-02T03:25:19.988005](https://huggingface.co/datasets/open-llm-leaderboard/details_Samee-ur__NeuralPipe-7B-slerp/blob/main/results_2024-02-02T03-25-19.988005.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6444688446653744,
"acc_stderr": 0.03217564834975917,
"acc_norm": 0.6448609553287138,
"acc_norm_stderr": 0.032833467276313325,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5985018412437423,
"mc2_stderr": 0.01514980059720055
},
"harness|arc:challenge|25": {
"acc": 0.6476109215017065,
"acc_stderr": 0.013960142600598675,
"acc_norm": 0.6774744027303754,
"acc_norm_stderr": 0.013659980894277364
},
"harness|hellaswag|10": {
"acc": 0.6700856403106951,
"acc_stderr": 0.004692208279690595,
"acc_norm": 0.8616809400517825,
"acc_norm_stderr": 0.0034452899250117337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.01509421569970048,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.01509421569970048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.0257449025322909,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.0257449025322909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323793,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323793
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.016083749986853697,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.016083749986853697
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015058,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5985018412437423,
"mc2_stderr": 0.01514980059720055
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.01120186274448705
},
"harness|gsm8k|5": {
"acc": 0.6853677028051555,
"acc_stderr": 0.01279103722733604
}
}
```
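The per-subtask scores above feed into a single aggregate; as a sketch (assuming, as the leaderboard appears to do, an unweighted mean over the subtask `acc` values):

```python
# A handful of the per-subtask accuracies reported above.
subtask_acc = {
    "abstract_algebra": 0.3,
    "anatomy": 0.6074074074074074,
    "astronomy": 0.7105263157894737,
    "business_ethics": 0.61,
}

# Unweighted mean across subtasks; the full aggregate averages all 57.
mmlu_avg = sum(subtask_acc.values()) / len(subtask_acc)
print(f"{mmlu_avg:.4f}")  # → 0.5570
```

Since `acc` and `acc_norm` are identical for every subtask in this run, averaging either column gives the same aggregate.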
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_mychen76__mistral-7b-merged-slerp | ---
pretty_name: Evaluation run of mychen76/mistral-7b-merged-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mychen76/mistral-7b-merged-slerp](https://huggingface.co/mychen76/mistral-7b-merged-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mychen76__mistral-7b-merged-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T11:04:57.263703](https://huggingface.co/datasets/open-llm-leaderboard/details_mychen76__mistral-7b-merged-slerp/blob/main/results_2024-03-10T11-04-57.263703.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6444688446653744,\n\
\ \"acc_stderr\": 0.03217564834975917,\n \"acc_norm\": 0.6448609553287138,\n\
\ \"acc_norm_stderr\": 0.032833467276313325,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5985018412437423,\n\
\ \"mc2_stderr\": 0.01514980059720055\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.013960142600598675,\n\
\ \"acc_norm\": 0.6774744027303754,\n \"acc_norm_stderr\": 0.013659980894277364\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6700856403106951,\n\
\ \"acc_stderr\": 0.004692208279690595,\n \"acc_norm\": 0.8616809400517825,\n\
\ \"acc_norm_stderr\": 0.0034452899250117337\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n\
\ \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n\
\ \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n\
\ \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n\
\ \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n \
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"\
acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.0257449025322909,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.0257449025322909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323793,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323793\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n\
\ \"acc_stderr\": 0.016083749986853697,\n \"acc_norm\": 0.36312849162011174,\n\
\ \"acc_norm_stderr\": 0.016083749986853697\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015058,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015058\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5985018412437423,\n\
\ \"mc2_stderr\": 0.01514980059720055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.01120186274448705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6853677028051555,\n \
\ \"acc_stderr\": 0.01279103722733604\n }\n}\n```"
repo_url: https://huggingface.co/mychen76/mistral-7b-merged-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|winogrande|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T11-04-57.263703.parquet'
- config_name: results
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- results_2024-03-10T11-04-57.263703.parquet
- split: latest
path:
- results_2024-03-10T11-04-57.263703.parquet
---
# Dataset Card for Evaluation run of mychen76/mistral-7b-merged-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mychen76/mistral-7b-merged-slerp](https://huggingface.co/mychen76/mistral-7b-merged-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mychen76__mistral-7b-merged-slerp",
"harness_winogrande_5",
	split="latest")
```
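Once a record has been loaded, the aggregated metrics can be pulled out with ordinary dictionary handling. A minimal sketch, using a hypothetical stand-in record shaped like the "all" block of the JSON excerpt below rather than data actually fetched from the Hub (`headline_metrics` is an illustrative helper, not part of the `datasets` API):

```python
# Hypothetical record, shaped like the aggregated "results" JSON below.
# Real values would come from the "results" configuration of this dataset.
record = {
    "all": {
        "acc": 0.6444688446653744,
        "acc_norm": 0.6448609553287138,
        "mc2": 0.5985018412437423,
    },
    "harness|arc:challenge|25": {"acc_norm": 0.6774744027303754},
    "harness|hellaswag|10": {"acc_norm": 0.8616809400517825},
}

def headline_metrics(results: dict) -> dict:
    """Split a results record into overall metrics and per-task acc_norm."""
    overall = results["all"]
    per_task = {
        task: scores["acc_norm"]
        for task, scores in results.items()
        if task != "all" and "acc_norm" in scores
    }
    return {"overall": overall, "per_task": per_task}

metrics = headline_metrics(record)
print(f"overall acc_norm: {metrics['overall']['acc_norm']:.4f}")
```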
## Latest results
These are the [latest results from run 2024-03-10T11:04:57.263703](https://huggingface.co/datasets/open-llm-leaderboard/details_mychen76__mistral-7b-merged-slerp/blob/main/results_2024-03-10T11-04-57.263703.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its per-task configuration, under the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6444688446653744,
"acc_stderr": 0.03217564834975917,
"acc_norm": 0.6448609553287138,
"acc_norm_stderr": 0.032833467276313325,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5985018412437423,
"mc2_stderr": 0.01514980059720055
},
"harness|arc:challenge|25": {
"acc": 0.6476109215017065,
"acc_stderr": 0.013960142600598675,
"acc_norm": 0.6774744027303754,
"acc_norm_stderr": 0.013659980894277364
},
"harness|hellaswag|10": {
"acc": 0.6700856403106951,
"acc_stderr": 0.004692208279690595,
"acc_norm": 0.8616809400517825,
"acc_norm_stderr": 0.0034452899250117337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.01509421569970048,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.01509421569970048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.0257449025322909,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.0257449025322909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323793,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323793
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.016083749986853697,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.016083749986853697
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015058,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5985018412437423,
"mc2_stderr": 0.01514980059720055
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.01120186274448705
},
"harness|gsm8k|5": {
"acc": 0.6853677028051555,
"acc_stderr": 0.01279103722733604
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AlekseyKorshuk/vicuna-v0-lmgym | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
- name: input_text
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 2231621549
num_examples: 268680
download_size: 1067136760
dataset_size: 2231621549
---
# Dataset Card for "vicuna-v0-lmgym"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/imdb_affix | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: words_with_affixes
dtype: 'null'
splits:
- name: test
download_size: 1015
dataset_size: 0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "imdb_affix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Crad/pl-wiki | ---
language:
- pl
tags:
- wikipedia
--- |
rathi2023/binn_nhood | ---
dataset_info:
features:
- name: image
dtype: image
- name: objects
struct:
- name: Ids
sequence: string
- name: captions
sequence: string
- name: quantities
sequence: int64
splits:
- name: train
num_bytes: 233615.0
num_examples: 4
download_size: 236436
dataset_size: 233615.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yankihue/turkish-news-categories | ---
language:
- tr
--- |
CyberHarem/akane_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of akane/アカネ (Pokémon)
This is the dataset of akane/アカネ (Pokémon), containing 500 images and their tags.
The core tags of this character are `pink_hair, breasts, twintails, pink_eyes, hair_ornament, hairclip, large_breasts, bangs, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 413.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 273.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1067 | 533.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 381.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1067 | 696.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/akane_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, nude, penis, blush, pov, solo_focus, cum, fellatio, looking_at_viewer, paizuri, huge_breasts, sweat, censored, heart-shaped_pupils |
| 1 | 6 |  |  |  |  |  | 1girl, buttons, eyelashes, looking_at_viewer, open_mouth, smile, tongue, white_jacket, blue_shorts, short_sleeves, ;d, heart, one_eye_closed, pokemon_(creature), solo, shirt, short_shorts, wristband |
| 2 | 10 |  |  |  |  |  | 1girl, open_mouth, solo, blush, smile, looking_at_viewer, heart, huge_breasts |
| 3 | 5 |  |  |  |  |  | 1girl, blush, nipples, open_shirt, solo, looking_at_viewer, open_mouth, breasts_out, buttons, navel, smile, collarbone, no_bra, shorts, simple_background |
| 4 | 6 |  |  |  |  |  | 1girl, blush, nipples, nude, solo, pussy, lactation, navel, open_mouth |
| 5 | 6 |  |  |  |  |  | :d, official_alternate_costume, open_mouth, tongue, 1girl, blush, christmas, eyelashes, gloves, red_headwear, santa_hat, brown_belt, dress, closed_eyes, detached_sleeves, pokemon_(creature), white_shorts |
| 6 | 19 |  |  |  |  |  | 1girl, hetero, nipples, 1boy, penis, sex, vaginal, blush, solo_focus, open_mouth, nude, spread_legs, mosaic_censoring, cum_in_pussy, uncensored |
| 7 | 9 |  |  |  |  |  | 1girl, cow_print, solo, collar, huge_breasts, blush, cow_horns, elbow_gloves, neck_bell, cow_ears, cow_tail, open_mouth, cowbell, thighhighs, areola_slip, cow_girl, looking_at_viewer, smile, sweat |
| 8 | 10 |  |  |  |  |  | 1girl, cow_print, hetero, 1boy, blush, cow_horns, huge_breasts, nipples, cow_ears, cowbell, neck_bell, solo_focus, collar, fake_animal_ears, nude, open_mouth, heart, penis, bikini, elbow_gloves, paizuri, sex, simple_background, sweat, tongue, white_background |
| 9 | 6 |  |  |  |  |  | 1boy, 1girl, blush, penis, smile, bikini, cow_print, solo_focus, breasts_squeezed_together, open_mouth, cum_on_breasts, heart, mosaic_censoring, paizuri_under_clothes, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | hetero | nipples | nude | penis | blush | pov | solo_focus | cum | fellatio | looking_at_viewer | paizuri | huge_breasts | sweat | censored | heart-shaped_pupils | buttons | eyelashes | open_mouth | smile | tongue | white_jacket | blue_shorts | short_sleeves | ;d | heart | one_eye_closed | pokemon_(creature) | solo | shirt | short_shorts | wristband | open_shirt | breasts_out | navel | collarbone | no_bra | shorts | simple_background | pussy | lactation | :d | official_alternate_costume | christmas | gloves | red_headwear | santa_hat | brown_belt | dress | closed_eyes | detached_sleeves | white_shorts | sex | vaginal | spread_legs | mosaic_censoring | cum_in_pussy | uncensored | cow_print | collar | cow_horns | elbow_gloves | neck_bell | cow_ears | cow_tail | cowbell | thighhighs | areola_slip | cow_girl | fake_animal_ears | bikini | white_background | breasts_squeezed_together | cum_on_breasts | paizuri_under_clothes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:---------|:----------|:-------|:--------|:--------|:------|:-------------|:------|:-----------|:--------------------|:----------|:---------------|:--------|:-----------|:----------------------|:----------|:------------|:-------------|:--------|:---------|:---------------|:--------------|:----------------|:-----|:--------|:-----------------|:---------------------|:-------|:--------|:---------------|:------------|:-------------|:--------------|:--------|:-------------|:---------|:---------|:--------------------|:--------|:------------|:-----|:-----------------------------|:------------|:---------|:---------------|:------------|:-------------|:--------|:--------------|:-------------------|:---------------|:------|:----------|:--------------|:-------------------|:---------------|:-------------|:------------|:---------|:------------|:---------------|:------------|:-----------|:-----------|:----------|:-------------|:--------------|:-----------|:-------------------|:---------|:-------------------|:----------------------------|:-----------------|:------------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | | X | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | | X | | | | | X | | | | | X | | X | | | | | | X | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | | X | | X | | | X | | | | | X | | | | | | X | | X | X | | | | | | | | | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | | X | | X | X | | X | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | | X | | | | | X | | | | | | | | | | | | X | X | | X | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | | X | | | | | X | | | | | X | | X | X | | | | | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 8 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | | | X | X | X | | | | | X | | X | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | | X | | | | X | X | X | | | |
| 9 | 6 |  |  |  |  |  | X | X | | | | X | X | | X | | | | | | X | | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | X | | X | X | X |
|
TinyPixel/based_2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 58034
num_examples: 176
download_size: 31503
dataset_size: 58034
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "based_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/torisumi_horou_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of torisumi_horou/鳥澄珠烏 (Touhou)
This is the dataset of torisumi_horou/鳥澄珠烏 (Touhou), containing 23 images and their tags.
The core tags of this character are `multicolored_hair, white_hair, bow, short_hair, hat, red_bow, white_headwear, wings, yellow_eyes, black_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 32.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/torisumi_horou_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 19.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/torisumi_horou_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 58 | 40.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/torisumi_horou_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 29.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/torisumi_horou_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 58 | 54.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/torisumi_horou_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/torisumi_horou_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, puffy_short_sleeves, solo, white_shirt, pink_vest, smile, closed_mouth, collared_shirt, looking_at_viewer, red_bowtie, red_socks, pink_shorts, book, frills, multicolored_wings, pink_skirt, shoes, white_background, white_footwear, belt, blush, full_body, rainbow_gradient, simple_background, test_tube |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | puffy_short_sleeves | solo | white_shirt | pink_vest | smile | closed_mouth | collared_shirt | looking_at_viewer | red_bowtie | red_socks | pink_shorts | book | frills | multicolored_wings | pink_skirt | shoes | white_background | white_footwear | belt | blush | full_body | rainbow_gradient | simple_background | test_tube |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------------|:-------|:--------------|:------------|:--------|:---------------|:-----------------|:--------------------|:-------------|:------------|:--------------|:-------|:---------|:---------------------|:-------------|:--------|:-------------------|:-----------------|:-------|:--------|:------------|:-------------------|:--------------------|:------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
nlplabtdtu/summarization_sft_prompted | ---
language: vi
dataset_info:
features:
- name: summary
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3857903
num_examples: 1000
- name: test
num_bytes: 781238
num_examples: 200
download_size: 2286819
dataset_size: 4639141
---
# Dataset Card for "tdtunlplab_news_summary_2_prompt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
learn3r/gov_report_memsum_bp | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 169706535
num_examples: 17457
- name: validation
num_bytes: 11085755
num_examples: 972
- name: test
num_bytes: 11134235
num_examples: 973
download_size: 87102306
dataset_size: 191926525
---
# Dataset Card for "gov_report_memsum_bp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jinawei/shadow-alignment-data | ---
license: apache-2.0
---
## Shadow-Alignment-Dataset |
ch08931/GabrielC | ---
license: openrail
---
|
JovialValley/broadclass_totaldataset_2 | ---
dataset_info:
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype: string
- name: emotion
dtype: string
- name: emotion_str
dtype: string
splits:
- name: train
num_bytes: 163848386.0
num_examples: 390
- name: test
num_bytes: 40722720.0
num_examples: 97
download_size: 137727655
dataset_size: 204571106.0
---
# Dataset Card for "broadclass_totaldataset_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_nicholasKluge__Aira-2-124M | ---
pretty_name: Evaluation run of nicholasKluge/Aira-2-124M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nicholasKluge/Aira-2-124M](https://huggingface.co/nicholasKluge/Aira-2-124M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-2-124M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T00:58:54.483693](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-2-124M/blob/main/results_2023-08-26T00%3A58%3A54.483693.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2563784281118179,\n\
\ \"acc_stderr\": 0.03131922643477471,\n \"acc_norm\": 0.25747333491194596,\n\
\ \"acc_norm_stderr\": 0.03133423457395941,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.39825983953563676,\n\
\ \"mc2_stderr\": 0.014916655527587098\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2030716723549488,\n \"acc_stderr\": 0.011755899303705582,\n\
\ \"acc_norm\": 0.2431740614334471,\n \"acc_norm_stderr\": 0.012536554144587094\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2907787293367855,\n\
\ \"acc_stderr\": 0.004531935391507024,\n \"acc_norm\": 0.3152758414658435,\n\
\ \"acc_norm_stderr\": 0.004636760762522853\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.02661648298050171,\n\
\ \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.02661648298050171\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2723404255319149,\n \"acc_stderr\": 0.0291012906983867,\n\
\ \"acc_norm\": 0.2723404255319149,\n \"acc_norm_stderr\": 0.0291012906983867\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.03268454013011743,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.03268454013011743\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.024892469172462826,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.024892469172462826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n\
\ \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.02450347255711094,\n \
\ \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.02450347255711094\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3025210084033613,\n \"acc_stderr\": 0.02983796238829193,\n \
\ \"acc_norm\": 0.3025210084033613,\n \"acc_norm_stderr\": 0.02983796238829193\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"\
acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n\
\ \"acc_stderr\": 0.030778554678693264,\n \"acc_norm\": 0.25980392156862747,\n\
\ \"acc_norm_stderr\": 0.030778554678693264\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n\
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n\
\ \"acc_stderr\": 0.020799400082879997,\n \"acc_norm\": 0.10762331838565023,\n\
\ \"acc_norm_stderr\": 0.020799400082879997\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.040073418097558065,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.040073418097558065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n\
\ \"acc_stderr\": 0.02559819368665225,\n \"acc_norm\": 0.18803418803418803,\n\
\ \"acc_norm_stderr\": 0.02559819368665225\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.227330779054917,\n\
\ \"acc_stderr\": 0.014987270640946015,\n \"acc_norm\": 0.227330779054917,\n\
\ \"acc_norm_stderr\": 0.014987270640946015\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071138,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071138\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.025360603796242553,\n\
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.025360603796242553\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19292604501607716,\n\
\ \"acc_stderr\": 0.022411516780911366,\n \"acc_norm\": 0.19292604501607716,\n\
\ \"acc_norm_stderr\": 0.022411516780911366\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543343,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543343\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n\
\ \"acc_stderr\": 0.011015752255279338,\n \"acc_norm\": 0.2470664928292047,\n\
\ \"acc_norm_stderr\": 0.011015752255279338\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.040693063197213754,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.040693063197213754\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014638,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014638\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\
\ \"acc_stderr\": 0.031069390260789437,\n \"acc_norm\": 0.19879518072289157,\n\
\ \"acc_norm_stderr\": 0.031069390260789437\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03615507630310935,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03615507630310935\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.39825983953563676,\n\
\ \"mc2_stderr\": 0.014916655527587098\n }\n}\n```"
repo_url: https://huggingface.co/nicholasKluge/Aira-2-124M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:58:54.483693.parquet'
- config_name: results
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- results_2023-08-26T00:58:54.483693.parquet
- split: latest
path:
- results_2023-08-26T00:58:54.483693.parquet
---
# Dataset Card for Evaluation run of nicholasKluge/Aira-2-124M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-2-124M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-2-124M](https://huggingface.co/nicholasKluge/Aira-2-124M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-2-124M",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-26T00:58:54.483693](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-2-124M/blob/main/results_2023-08-26T00%3A58%3A54.483693.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2563784281118179,
"acc_stderr": 0.03131922643477471,
"acc_norm": 0.25747333491194596,
"acc_norm_stderr": 0.03133423457395941,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.39825983953563676,
"mc2_stderr": 0.014916655527587098
},
"harness|arc:challenge|25": {
"acc": 0.2030716723549488,
"acc_stderr": 0.011755899303705582,
"acc_norm": 0.2431740614334471,
"acc_norm_stderr": 0.012536554144587094
},
"harness|hellaswag|10": {
"acc": 0.2907787293367855,
"acc_stderr": 0.004531935391507024,
"acc_norm": 0.3152758414658435,
"acc_norm_stderr": 0.004636760762522853
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2490566037735849,
"acc_stderr": 0.02661648298050171,
"acc_norm": 0.2490566037735849,
"acc_norm_stderr": 0.02661648298050171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2723404255319149,
"acc_stderr": 0.0291012906983867,
"acc_norm": 0.2723404255319149,
"acc_norm_stderr": 0.0291012906983867
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011743,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.14,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.14,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462826,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3717948717948718,
"acc_stderr": 0.02450347255711094,
"acc_norm": 0.3717948717948718,
"acc_norm_stderr": 0.02450347255711094
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3025210084033613,
"acc_stderr": 0.02983796238829193,
"acc_norm": 0.3025210084033613,
"acc_norm_stderr": 0.02983796238829193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.020799400082879997,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.020799400082879997
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.040073418097558065,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.040073418097558065
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.02559819368665225,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.02559819368665225
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.227330779054917,
"acc_stderr": 0.014987270640946015,
"acc_norm": 0.227330779054917,
"acc_norm_stderr": 0.014987270640946015
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071138,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071138
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19292604501607716,
"acc_stderr": 0.022411516780911366,
"acc_norm": 0.19292604501607716,
"acc_norm_stderr": 0.022411516780911366
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543343,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543343
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279338,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279338
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.040693063197213754,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.040693063197213754
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014638,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014638
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.031069390260789437,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.031069390260789437
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03615507630310935,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03615507630310935
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.39825983953563676,
"mc2_stderr": 0.014916655527587098
}
}
```
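Each per-task key in the JSON above follows the same `harness|<task>|<n_shot>` pattern used in the configuration names; a minimal sketch of splitting such a key into its parts (the key string here is taken directly from this card):

```python
# Split an eval key like "harness|hendrycksTest-world_religions|5"
# into its framework, task name, and few-shot count.
key = "harness|hendrycksTest-world_religions|5"
framework, task, n_shot = key.split("|")
print(framework, task, int(n_shot))  # harness hendrycksTest-world_religions 5
```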
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gizemgg/eunews-eng | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1741935277
num_examples: 589938
- name: test
num_bytes: 438103409
num_examples: 147484
download_size: 827642652
dataset_size: 2180038686
---
# Dataset Card for "eunews-eng"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wannaphong/thai_sample_500k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2878877988
num_examples: 500000
download_size: 1128997330
dataset_size: 2878877988
---
# Dataset Card for "thai_sample_500k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eb/num50000test | ---
dataset_info:
features:
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 67123858.5
num_examples: 45000
- name: test
num_bytes: 7458206.5
num_examples: 5000
download_size: 42801996
dataset_size: 74582065.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
hssd/hssd-scenes | ---
language:
- en
pretty_name: HSSD
tags:
- 3D scenes
- Embodied AI
license: cc-by-nc-4.0
extra_gated_heading: "Acknowledge license to accept the repository"
extra_gated_prompt: "You agree to use this dataset under the [CC BY-NC 4.0 license](https://creativecommons.org/licenses/by-nc/4.0/) terms"
---
HSSD: Habitat Synthetic Scenes Dataset
==================================
The [Habitat Synthetic Scenes Dataset (HSSD)](https://3dlg-hcvc.github.io/hssd/) is a human-authored 3D scene dataset that more closely mirrors real scenes than prior datasets.
Our dataset represents real interiors and contains a diverse set of 211 scenes and more than 18000 models of real-world objects.
<img src="https://i.imgur.com/XEkLxNs.png" width=50%>
|
open-llm-leaderboard/details_ankhamun__xxxI-Ixxx | ---
pretty_name: Evaluation run of ankhamun/xxxI-Ixxx
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ankhamun/xxxI-Ixxx](https://huggingface.co/ankhamun/xxxI-Ixxx) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ankhamun__xxxI-Ixxx\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T19:25:58.917913](https://huggingface.co/datasets/open-llm-leaderboard/details_ankhamun__xxxI-Ixxx/blob/main/results_2024-02-09T19-25-58.917913.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5185710776579808,\n\
\ \"acc_stderr\": 0.034251914485577906,\n \"acc_norm\": 0.5240726925248631,\n\
\ \"acc_norm_stderr\": 0.03498635392452543,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907926,\n \"mc2\": 0.5442191956457653,\n\
\ \"mc2_stderr\": 0.01519663174796153\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4931740614334471,\n \"acc_stderr\": 0.014610029151379813,\n\
\ \"acc_norm\": 0.5418088737201365,\n \"acc_norm_stderr\": 0.014560220308714695\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5446126269667397,\n\
\ \"acc_stderr\": 0.004969879532843072,\n \"acc_norm\": 0.7254530969926309,\n\
\ \"acc_norm_stderr\": 0.00445373590094783\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273958,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273958\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.02487081525105709,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02487081525105709\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848879,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848879\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.027869320571664632,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.027869320571664632\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406796,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406796\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6464646464646465,\n \"acc_stderr\": 0.03406086723547155,\n \"\
acc_norm\": 0.6464646464646465,\n \"acc_norm_stderr\": 0.03406086723547155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.689119170984456,\n \"acc_stderr\": 0.03340361906276586,\n\
\ \"acc_norm\": 0.689119170984456,\n \"acc_norm_stderr\": 0.03340361906276586\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.02534967290683866,\n \
\ \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.02534967290683866\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.4789915966386555,\n \"acc_stderr\": 0.03244980849990029,\n\
\ \"acc_norm\": 0.4789915966386555,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7009174311926606,\n \"acc_stderr\": 0.019630417285415182,\n \"\
acc_norm\": 0.7009174311926606,\n \"acc_norm_stderr\": 0.019630417285415182\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3287037037037037,\n \"acc_stderr\": 0.032036140846700596,\n \"\
acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.032036140846700596\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373618,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373618\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6877637130801688,\n \"acc_stderr\": 0.030165137867847004,\n \
\ \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.030165137867847004\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.5695067264573991,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352168,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352168\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.027421007295392923,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.027421007295392923\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956914,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956914\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7126436781609196,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.7126436781609196,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.026424816594009845,\n\
\ \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.026424816594009845\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28268156424581004,\n\
\ \"acc_stderr\": 0.0150603817300181,\n \"acc_norm\": 0.28268156424581004,\n\
\ \"acc_norm_stderr\": 0.0150603817300181\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.028590752958852394,\n\
\ \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.028590752958852394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.027513925683549434,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.027513925683549434\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.595679012345679,\n \"acc_stderr\": 0.02730662529732768,\n\
\ \"acc_norm\": 0.595679012345679,\n \"acc_norm_stderr\": 0.02730662529732768\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650144,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650144\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3709256844850065,\n\
\ \"acc_stderr\": 0.01233739168453031,\n \"acc_norm\": 0.3709256844850065,\n\
\ \"acc_norm_stderr\": 0.01233739168453031\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.029896163033125468,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.029896163033125468\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5147058823529411,\n \"acc_stderr\": 0.020219083895133924,\n \
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.020219083895133924\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.032510068164586195,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.032510068164586195\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907926,\n \"mc2\": 0.5442191956457653,\n\
\ \"mc2_stderr\": 0.01519663174796153\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7024467245461721,\n \"acc_stderr\": 0.012849085254614659\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2395754359363154,\n \
\ \"acc_stderr\": 0.01175686434407741\n }\n}\n```"
repo_url: https://huggingface.co/ankhamun/xxxI-Ixxx
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|arc:challenge|25_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|gsm8k|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hellaswag|10_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T19-25-58.917913.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T19-25-58.917913.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- '**/details_harness|winogrande|5_2024-02-09T19-25-58.917913.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T19-25-58.917913.parquet'
- config_name: results
data_files:
- split: 2024_02_09T19_25_58.917913
path:
- results_2024-02-09T19-25-58.917913.parquet
- split: latest
path:
- results_2024-02-09T19-25-58.917913.parquet
---
# Dataset Card for Evaluation run of ankhamun/xxxI-Ixxx
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ankhamun/xxxI-Ixxx](https://huggingface.co/ankhamun/xxxI-Ixxx) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ankhamun__xxxI-Ixxx",
"harness_winogrande_5",
	split="latest")
```
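Each timestamped split name encodes the run time, so it can be parsed back into a `datetime` for sorting or comparing runs. A small illustrative helper (not part of the leaderboard tooling) assuming the naming scheme shown above:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Parse a run split name like '2024_02_09T19_25_58.917913' into a datetime."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

ts = parse_split_timestamp("2024_02_09T19_25_58.917913")
print(ts.year, ts.month, ts.day)  # 2024 2 9
```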
## Latest results
These are the [latest results from run 2024-02-09T19:25:58.917913](https://huggingface.co/datasets/open-llm-leaderboard/details_ankhamun__xxxI-Ixxx/blob/main/results_2024-02-09T19-25-58.917913.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5185710776579808,
"acc_stderr": 0.034251914485577906,
"acc_norm": 0.5240726925248631,
"acc_norm_stderr": 0.03498635392452543,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907926,
"mc2": 0.5442191956457653,
"mc2_stderr": 0.01519663174796153
},
"harness|arc:challenge|25": {
"acc": 0.4931740614334471,
"acc_stderr": 0.014610029151379813,
"acc_norm": 0.5418088737201365,
"acc_norm_stderr": 0.014560220308714695
},
"harness|hellaswag|10": {
"acc": 0.5446126269667397,
"acc_stderr": 0.004969879532843072,
"acc_norm": 0.7254530969926309,
"acc_norm_stderr": 0.00445373590094783
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273958,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273958
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02487081525105709,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02487081525105709
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848879,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848879
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6,
"acc_stderr": 0.027869320571664632,
"acc_norm": 0.6,
"acc_norm_stderr": 0.027869320571664632
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6464646464646465,
"acc_stderr": 0.03406086723547155,
"acc_norm": 0.6464646464646465,
"acc_norm_stderr": 0.03406086723547155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.689119170984456,
"acc_stderr": 0.03340361906276586,
"acc_norm": 0.689119170984456,
"acc_norm_stderr": 0.03340361906276586
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4948717948717949,
"acc_stderr": 0.02534967290683866,
"acc_norm": 0.4948717948717949,
"acc_norm_stderr": 0.02534967290683866
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4789915966386555,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.4789915966386555,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7009174311926606,
"acc_stderr": 0.019630417285415182,
"acc_norm": 0.7009174311926606,
"acc_norm_stderr": 0.019630417285415182
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373618,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373618
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6877637130801688,
"acc_stderr": 0.030165137867847004,
"acc_norm": 0.6877637130801688,
"acc_norm_stderr": 0.030165137867847004
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.027421007295392923,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.027421007295392923
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956914,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956914
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7126436781609196,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.7126436781609196,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.026424816594009845,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.026424816594009845
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28268156424581004,
"acc_stderr": 0.0150603817300181,
"acc_norm": 0.28268156424581004,
"acc_norm_stderr": 0.0150603817300181
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.028590752958852394,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.028590752958852394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.027513925683549434,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.027513925683549434
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.595679012345679,
"acc_stderr": 0.02730662529732768,
"acc_norm": 0.595679012345679,
"acc_norm_stderr": 0.02730662529732768
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650144,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650144
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3709256844850065,
"acc_stderr": 0.01233739168453031,
"acc_norm": 0.3709256844850065,
"acc_norm_stderr": 0.01233739168453031
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.029896163033125468,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.029896163033125468
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.032510068164586195,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.032510068164586195
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907926,
"mc2": 0.5442191956457653,
"mc2_stderr": 0.01519663174796153
},
"harness|winogrande|5": {
"acc": 0.7024467245461721,
"acc_stderr": 0.012849085254614659
},
"harness|gsm8k|5": {
"acc": 0.2395754359363154,
"acc_stderr": 0.01175686434407741
}
}
```
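Once loaded, the results JSON is a plain nested dict keyed by task name, so extracting a headline metric is a two-level lookup. A minimal sketch using a trimmed-down copy of the results shown above (the helper name is illustrative, not part of the leaderboard tooling):

```python
# Trimmed-down copy of the results dict printed above.
results = {
    "all": {"acc": 0.5185710776579808, "acc_norm": 0.5240726925248631},
    "harness|winogrande|5": {"acc": 0.7024467245461721},
}

def metric(results: dict, task: str, name: str = "acc") -> float:
    """Look up a single metric for a task in a leaderboard results dict."""
    return results[task][name]

print(round(metric(results, "harness|winogrande|5"), 4))  # 0.7024
```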
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
deokhk/te_wiki_sentences_1000000 | ---
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 189344044
num_examples: 1000000
- name: dev
num_bytes: 166164
num_examples: 1000
download_size: 43341997
dataset_size: 189510208
---
# Dataset Card for "te_wiki_sentences_1000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
unreal-hug/REAL_DATASET_SEG_401_9_lbls | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 108706033.0
num_examples: 401
download_size: 7968686
dataset_size: 108706033.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nitinbhayana/title_reverse_ner | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 44105
num_examples: 134
download_size: 28152
dataset_size: 44105
---
# Dataset Card for "title_reverse_ner"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
brettbbb/vicuna_qa_causal_LM_split | ---
dataset_info:
features:
- name: question
dtype: string
- name: mc1_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int32
- name: mc2_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int32
splits:
- name: train
num_bytes: 486818.29375764995
num_examples: 653
- name: test
num_bytes: 122263.70624235006
num_examples: 164
download_size: 280226
dataset_size: 609082.0
---
# Dataset Card for "vicuna_qa_causal_LM_split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ManuelAlv/academic_conuseling | ---
configs:
- config_name: default
data_files:
- split: dataset
path: data/dataset-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: dataset
num_bytes: 7496
num_examples: 25
- name: test
num_bytes: 1278
num_examples: 13
download_size: 10438
dataset_size: 8774
---
# Dataset Card for "academic_conuseling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mahdibaghbanzadeh/GUE_EMP_H4ac | ---
dataset_info:
features:
- name: sequence
dtype: string
- name: labels
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 13964590
num_examples: 27275
- name: val
num_bytes: 1745869
num_examples: 3410
- name: test
num_bytes: 1745920
num_examples: 3410
download_size: 8236992
dataset_size: 17456379
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
Nexdata/189_Videos_Electric_Bicycle_Entering_Elevator_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
189 Videos - Electric Bicycle Entering Elevator Data; the total duration is 1 hour 58 minutes 40.72 seconds. The data covers different types of elevators, different types of electric bicycles, and different time periods. The data can be used for tasks such as electric bicycle detection and electric bicycle recognition.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1136?source=Huggingface
## Data size
189 videos, the total duration is 1 hour 58 minutes 40.72 seconds
## Collecting environment
indoor scenes
## Data diversity
different types of elevators, different types of non-electric bicycles, different types of electric bicycles, different time periods
## Device
surveillance cameras
## Data format
.mp4
## Accuracy
the accuracy of label of vehicle type is more than 97%
# Licensing Information
Commercial License
|
shreevigneshs/iwslt-2023-en-ru-train-val-split-0.2 | ---
dataset_info:
features:
- name: en
dtype: string
- name: ru
dtype: string
- name: ru_annotated
dtype: string
- name: styles
dtype: int64
splits:
- name: if_test
num_bytes: 327410
num_examples: 600
- name: f_test
num_bytes: 327839
num_examples: 600
- name: f_flores
num_bytes: 414702
num_examples: 1012
- name: if_flores
num_bytes: 414702
num_examples: 1012
download_size: 836846
dataset_size: 1484653
language:
- ru
- en
---
# Dataset Card for "iwslt-2023-en-ru-train-val-split-0.2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johannes-garstenauer/ENN_class_embeddings_dim_512 | ---
dataset_info:
features:
- name: last_hs
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 138580320
num_examples: 67272
download_size: 167196918
dataset_size: 138580320
---
# Dataset Card for "ENN_class_embeddings_dim_512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/101_People_4538_Images_Japanese_Handwriting_OCR_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
101 People - 4,538 Images Japanese Handwriting OCR Data. The text carrier is A4 paper. The dataset content covers social livelihood, entertainment, tourism, sports, movies, composition, and other fields. For annotation, character-level rectangular bounding box annotation and text transcription were adopted. The dataset can be used for tasks such as Japanese handwriting OCR.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1087?source=Huggingface
## Data size
101 people, 4,538 images
## Collecting environment
A4 paper
## Device
scanner
## Photographic angle
eye-level angle
## Data format
the image data format is .jpg, the annotation file format is .json
## Data content
including social livelihood, entertainment, tourism, sports, movies, composition, and other fields
## Annotation content
character-level rectangular bounding box annotation and text transcription
## Accuracy
an annotation is qualified if the error bound of each vertex of the rectangular bounding box is within 2 pixels; the accuracy of bounding boxes is not less than 98%, and the character transcription accuracy is not less than 98%
# Licensing Information
Commercial License
|
fusing/instructpix2pix-1000-samples | ---
dataset_info:
features:
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 416880759.0
num_examples: 1000
download_size: 416899514
dataset_size: 416880759.0
---
# Dataset Card for "instructpix2pix-1000-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
The dataset was created using the code from [this repository](https://github.com/sayakpaul/instruct-pix2pix-dataset). |
open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV4.20-Vision-32k-7B | ---
pretty_name: Evaluation run of Nitral-AI/Eris_PrimeV4.20-Vision-32k-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Nitral-AI/Eris_PrimeV4.20-Vision-32k-7B](https://huggingface.co/Nitral-AI/Eris_PrimeV4.20-Vision-32k-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV4.20-Vision-32k-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T21:25:00.837126](https://huggingface.co/datasets/open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV4.20-Vision-32k-7B/blob/main/results_2024-03-29T21-25-00.837126.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6384785420614574,\n\
\ \"acc_stderr\": 0.032470197644124336,\n \"acc_norm\": 0.6409023014166276,\n\
\ \"acc_norm_stderr\": 0.03312260754991937,\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.016776599676729405,\n \"mc2\": 0.5253069758264901,\n\
\ \"mc2_stderr\": 0.015295427525749042\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6109215017064846,\n \"acc_stderr\": 0.014247309976045607,\n\
\ \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726097\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6539533957379008,\n\
\ \"acc_stderr\": 0.004747360500742481,\n \"acc_norm\": 0.8480382393945429,\n\
\ \"acc_norm_stderr\": 0.0035825015965645518\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\"\
: 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764815,\n \"\
acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764815\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396987,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396987\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473075,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135353,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135353\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507337,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507337\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n\
\ \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323374,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323374\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\
\ \"acc_stderr\": 0.016531170993278888,\n \"acc_norm\": 0.4245810055865922,\n\
\ \"acc_norm_stderr\": 0.016531170993278888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.024739981355113592,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.024739981355113592\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.01273492357953207,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.01273492357953207\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.016776599676729405,\n \"mc2\": 0.5253069758264901,\n\
\ \"mc2_stderr\": 0.015295427525749042\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.011350315707462049\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5716451857467779,\n \
\ \"acc_stderr\": 0.013630362189382147\n }\n}\n```"
repo_url: https://huggingface.co/Nitral-AI/Eris_PrimeV4.20-Vision-32k-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-25-00.837126.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-25-00.837126.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- '**/details_harness|winogrande|5_2024-03-29T21-25-00.837126.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T21-25-00.837126.parquet'
- config_name: results
data_files:
- split: 2024_03_29T21_25_00.837126
path:
- results_2024-03-29T21-25-00.837126.parquet
- split: latest
path:
- results_2024-03-29T21-25-00.837126.parquet
---
# Dataset Card for Evaluation run of Nitral-AI/Eris_PrimeV4.20-Vision-32k-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Nitral-AI/Eris_PrimeV4.20-Vision-32k-7B](https://huggingface.co/Nitral-AI/Eris_PrimeV4.20-Vision-32k-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV4.20-Vision-32k-7B",
"harness_winogrande_5",
    split="latest")
```
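The configuration names follow a mechanical mapping from the harness task identifiers embedded in the parquet filenames (e.g. `harness|hendrycksTest-world_religions|5` becomes `harness_hendrycksTest_world_religions_5`). A small sketch of that mapping, inferred from the listing above rather than taken from any official helper:

```python
def task_id_to_config_name(task_id: str) -> str:
    """Map a harness task identifier (as seen in the parquet filenames)
    to the corresponding dataset configuration name.

    Inferred from the config listing above; not an official API.
    """
    return task_id.replace("|", "_").replace("-", "_").replace(":", "_")

print(task_id_to_config_name("harness|winogrande|5"))     # harness_winogrande_5
print(task_id_to_config_name("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```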
## Latest results
These are the [latest results from run 2024-03-29T21:25:00.837126](https://huggingface.co/datasets/open-llm-leaderboard/details_Nitral-AI__Eris_PrimeV4.20-Vision-32k-7B/blob/main/results_2024-03-29T21-25-00.837126.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6384785420614574,
"acc_stderr": 0.032470197644124336,
"acc_norm": 0.6409023014166276,
"acc_norm_stderr": 0.03312260754991937,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.016776599676729405,
"mc2": 0.5253069758264901,
"mc2_stderr": 0.015295427525749042
},
"harness|arc:challenge|25": {
"acc": 0.6109215017064846,
"acc_stderr": 0.014247309976045607,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726097
},
"harness|hellaswag|10": {
"acc": 0.6539533957379008,
"acc_stderr": 0.004747360500742481,
"acc_norm": 0.8480382393945429,
"acc_norm_stderr": 0.0035825015965645518
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764815,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764815
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473075,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135353,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507337,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507337
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909476,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909476
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278888,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.024739981355113592,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.024739981355113592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.01273492357953207,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.01273492357953207
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.016776599676729405,
"mc2": 0.5253069758264901,
"mc2_stderr": 0.015295427525749042
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.011350315707462049
},
"harness|gsm8k|5": {
"acc": 0.5716451857467779,
"acc_stderr": 0.013630362189382147
}
}
```
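The per-task scores above can be post-processed with plain Python once loaded as a dict. A minimal sketch that averages `acc_norm` over three of the MMLU subtasks shown above (the values are copied from the JSON; the helper name is illustrative):

```python
# acc_norm values copied from the results above for three MMLU subtasks
results = {
    "harness|hendrycksTest-abstract_algebra|5": 0.3,
    "harness|hendrycksTest-anatomy|5": 0.6222222222222222,
    "harness|hendrycksTest-astronomy|5": 0.6973684210526315,
}

def mean_acc_norm(scores: dict) -> float:
    """Unweighted mean of the given acc_norm values."""
    return sum(scores.values()) / len(scores)

print(round(mean_acc_norm(results), 4))  # 0.5399
```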
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-project-ab647f27-7704968 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- masakhaner
eval_info:
task: entity_extraction
model: mbeukman/xlm-roberta-base-finetuned-ner-yoruba
metrics: []
dataset_name: masakhaner
dataset_config: yor
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: mbeukman/xlm-roberta-base-finetuned-ner-yoruba
* Dataset: masakhaner
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
kingsley9494/ks | ---
license: bigscience-openrail-m
---
|
tyzhu/rareid_find_second_sent_train_100_eval_10 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 258132
num_examples: 210
- name: validation
num_bytes: 10381
num_examples: 10
download_size: 130910
dataset_size: 268513
---
# Dataset Card for "rareid_find_second_sent_train_100_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/10114_People_Multi_view_Tracking_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
This dataset contains multi-view tracking data of 10,114 people in surveillance scenes, covering both indoor and outdoor environments. The data includes men and women of different ages. The annotations cover human body bounding boxes, human body + riding object bounding boxes, and 21 human body attributes for each tracked subject. This data can be used for multi-view human body tracking, re-identification (Re-ID), and other tasks.
For more details, please refer to the link: https://www.nexdata.ai/dataset/965?source=Huggingface
## Specifications
### Data size
10,114 people
### Population distribution
gender distribution: 4,198 males, 5,916 females; age distribution: children (838 people), students (1,197 people), the youth (5,336 people), middle age (2,363 people), the old (378 people), unsure (2 people)
### Collection environment
surveillance scenes, including indoor scenes and outdoor scenes
### Collection diversity
different light conditions, different scenes, different routes
### Collection device
surveillance camera; photographic angle: looking down
### Collection time
day, sunset
### Image parameters
resolution: 1,920x1,080; format: .jpg
### Annotation
rectangular bounding boxes of the human body; rectangular bounding boxes of human body + riding object; 21 human body attributes
### Accuracy
annotation accuracy of rectangular bounding boxes is over 95%; annotation accuracy of human body attributes is over 95%
## Licensing Information
Commercial License
|
Tristan/olm-CC-MAIN-2022-40-sampling-ratio-0.15894621295-perplexity-filters | ---
dataset_info:
features:
- name: text
dtype: string
- name: url
dtype: string
- name: crawl_timestamp
dtype: float64
- name: kenlm_ppl
dtype: float64
splits:
- name: train
num_bytes: 33197245533.0
num_examples: 14558171
download_size: 20748879886
dataset_size: 33197245533.0
---
# Dataset Card for "olm-CC-MAIN-2022-40-sampling-ratio-0.15894621295-perplexity-filters"
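The `kenlm_ppl` column indicates that each document carries a KenLM perplexity score, which is presumably what the "perplexity-filters" in the name refers to. A minimal sketch of threshold-based filtering over such records (the records and the threshold below are illustrative assumptions, not values from this dataset):

```python
# Hypothetical records mimicking the schema above (text, url, kenlm_ppl)
records = [
    {"text": "A well-formed sentence.", "url": "https://example.com/a", "kenlm_ppl": 180.5},
    {"text": "asdf qwer zxcv", "url": "https://example.com/b", "kenlm_ppl": 9120.0},
    {"text": "Another readable paragraph.", "url": "https://example.com/c", "kenlm_ppl": 240.1},
]

def filter_by_perplexity(rows, max_ppl):
    """Keep rows whose KenLM perplexity is at or below the threshold."""
    return [r for r in rows if r["kenlm_ppl"] <= max_ppl]

kept = filter_by_perplexity(records, max_ppl=1000.0)
print([r["url"] for r in kept])  # ['https://example.com/a', 'https://example.com/c']
```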
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
makaveli10/augmented-shrutilipi | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 28188508592.0
num_examples: 40000
download_size: 28080609408
dataset_size: 28188508592.0
---
# Dataset Card for "augmented-shrutilipi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
koun/myck | ---
license: afl-3.0
---
|
CyberHarem/colorado_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of colorado (Kantai Collection)
This is the dataset of colorado (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `blonde_hair, short_hair, braid, blue_eyes, breasts, large_breasts, side_braids, hat, headgear, garrison_cap, grey_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 547.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/colorado_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 332.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/colorado_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1203 | 725.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/colorado_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 495.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/colorado_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1203 | 985.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/colorado_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/colorado_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, black_gloves, black_pantyhose, blue_necktie, capelet, elbow_gloves, grey_dress, open_mouth, pleated_dress, sideboob, sleeveless, solo, white_shirt, simple_background, smile, white_background, looking_at_viewer, cowboy_shot |
| 1 | 5 |  |  |  |  |  | 1girl, black_gloves, black_pantyhose, blue_necktie, cannon, capelet, elbow_gloves, grey_dress, looking_at_viewer, machinery, pleated_dress, rigging, sideboob, sleeveless, smile, solo, turret, white_shirt, open_mouth, hand_on_own_chest, star_(symbol) |
| 2 | 7 |  |  |  |  |  | 1girl, black_gloves, blue_necktie, capelet, elbow_gloves, sideboob, sleeveless, solo, white_shirt, dated, grey_dress, looking_at_viewer, one-hour_drawing_challenge, simple_background, upper_body, white_background, smile |
| 3 | 18 |  |  |  |  |  | detached_collar, fake_animal_ears, playboy_bunny, rabbit_ears, 1girl, solo, alternate_costume, strapless_leotard, wrist_cuffs, simple_background, white_background, rabbit_tail, black_pantyhose, blue_leotard, cowboy_shot, looking_at_viewer, open_mouth, blush, bowtie, cleavage, necktie, smile |
| 4 | 7 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, solo, simple_background, white_background, cleavage, collarbone, sweater, upper_body, long_sleeves, smile |
| 5 | 21 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, collarbone, navel, simple_background, bikini, white_background, cleavage, cowboy_shot, smile |
| 6 | 15 |  |  |  |  |  | 1girl, solo, looking_at_viewer, cowboy_shot, open_mouth, competition_swimsuit, covered_navel, simple_background, blue_one-piece_swimsuit, collarbone, white_background, blush, dated, twitter_username, two-tone_swimsuit |
| 7 | 9 |  |  |  |  |  | sailor_dress, white_dress, blue_sailor_collar, cosplay, short_sleeves, 1girl, sailor_hat, simple_background, solo, white_background, white_headwear, blush, looking_at_viewer, cowboy_shot, white_gloves |
| 8 | 10 |  |  |  |  |  | 1girl, pleated_skirt, serafuku, solo, alternate_costume, looking_at_viewer, simple_background, white_background, blue_sailor_collar, cosplay, cowboy_shot, long_sleeves, open_mouth, blue_neckerchief, blue_skirt, blush, cleavage, white_sailor_collar, white_shirt, white_skirt |
| 9 | 5 |  |  |  |  |  | 1girl, black_panties, blush, crop_top, elbow_gloves, highleg_panties, serafuku, shimakaze_(kancolle)_(cosplay), solo, white_gloves, microskirt, navel, blue_sailor_collar, blue_skirt, collarbone, cowboy_shot, neckerchief, open_mouth, black_hairband, cleavage, looking_at_viewer, pleated_skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | black_pantyhose | blue_necktie | capelet | elbow_gloves | grey_dress | open_mouth | pleated_dress | sideboob | sleeveless | solo | white_shirt | simple_background | smile | white_background | looking_at_viewer | cowboy_shot | cannon | machinery | rigging | turret | hand_on_own_chest | star_(symbol) | dated | one-hour_drawing_challenge | upper_body | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | alternate_costume | strapless_leotard | wrist_cuffs | rabbit_tail | blue_leotard | blush | bowtie | cleavage | necktie | collarbone | sweater | long_sleeves | navel | bikini | competition_swimsuit | covered_navel | blue_one-piece_swimsuit | twitter_username | two-tone_swimsuit | sailor_dress | white_dress | blue_sailor_collar | cosplay | short_sleeves | sailor_hat | white_headwear | white_gloves | pleated_skirt | serafuku | blue_neckerchief | blue_skirt | white_sailor_collar | white_skirt | black_panties | crop_top | highleg_panties | shimakaze_(kancolle)_(cosplay) | microskirt | neckerchief | black_hairband |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:------------------|:---------------|:----------|:---------------|:-------------|:-------------|:----------------|:-----------|:-------------|:-------|:--------------|:--------------------|:--------|:-------------------|:--------------------|:--------------|:---------|:------------|:----------|:---------|:--------------------|:----------------|:--------|:-----------------------------|:-------------|:------------------|:-------------------|:----------------|:--------------|:--------------------|:--------------------|:--------------|:--------------|:---------------|:--------|:---------|:-----------|:----------|:-------------|:----------|:---------------|:--------|:---------|:-----------------------|:----------------|:--------------------------|:-------------------|:--------------------|:---------------|:--------------|:---------------------|:----------|:----------------|:-------------|:-----------------|:---------------|:----------------|:-----------|:-------------------|:-------------|:----------------------|:--------------|:----------------|:-----------|:------------------|:---------------------------------|:-------------|:--------------|:-----------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | | X | X | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 18 |  |  |  |  |  | X | | X | | | | | X | | | | X | | X | X | X | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | | | | | | | | | X | | X | X | X | X | | | | | | | | | | X | | | | | X | | | | | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 21 |  |  |  |  |  | X | | | | | | | | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | X | | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 15 |  |  |  |  |  | X | | | | | | | X | | | | X | | X | | X | X | X | | | | | | | X | | | | | | | | | | | | X | | | | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | | | | | | | | | | | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 8 | 10 |  |  |  |  |  | X | | | | | | | X | | | | X | X | X | | X | X | X | | | | | | | | | | | | | | X | | | | | X | | X | | | | X | | | | | | | | | | X | X | | | | | X | X | X | X | X | X | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | | | X | | X | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | X | | X | | X | | | X | | | | | | | | | X | | | | | X | X | X | | X | | | X | X | X | X | X | X | X |
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-119000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 963318
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
emozilla/Long-Data-Collections-Pretrain-Without-Books | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 236565210292
num_examples: 9383848
download_size: 25749677954
dataset_size: 236565210292
---
# Dataset Card for "Long-Data-Collections-Pretrain-Without-Books"
Parquet version of the pretrain split of [togethercomputer/Long-Data-Collections](https://huggingface.co/datasets/togethercomputer/Long-Data-Collections) WITHOUT books
Statistics (in # of characters): `total_len: 236088622215, average_len: 25159.041601590307`
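Character statistics like the ones above can be recomputed with a single pass over the `text` column. The `char_stats` helper below is a hypothetical sketch, not part of this repository:

```python
def char_stats(texts):
    """Accumulate total and average character counts over an iterable of strings."""
    total = count = 0
    for text in texts:
        total += len(text)
        count += 1
    return {"total_len": total, "average_len": total / count if count else 0.0}
```

With `datasets.load_dataset(..., streaming=True)`, passing a generator such as `(row["text"] for row in ds)` avoids materializing the full corpus in memory.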
|
communityai/HuggingFaceH4___capybara | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 72710513.0
num_examples: 15806
download_size: 37286202
dataset_size: 72710513.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
louisbrulenaudet/code-postes-communications-electroniques | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code des postes et des communications électroniques
source_datasets:
- original
pretty_name: Code des postes et des communications électroniques
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code des postes et des communications électroniques, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os
import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
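For reference, a record with the fields above could be flattened into a single training string along these lines. This is a minimal sketch: the `build_prompt` helper and the blank-line separator are assumptions, not part of the dataset:

```python
def build_prompt(record):
    """Join instruction, optional input, and output into one training string."""
    parts = [record["instruction"]]
    if record.get("input"):  # input may be empty for article-recitation instructions
        parts.append(record["input"])
    parts.append(record["output"])
    return "\n\n".join(parts)
```

Any equivalent template works; the important point is keeping the instruction before the article text so the model learns the instruction-following format.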
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
KatMarie/euparl_test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 22103017
num_examples: 133599
download_size: 11392783
dataset_size: 22103017
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "euparl_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python3-standardized_cluster_22 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 46213670
num_examples: 4452
download_size: 11140323
dataset_size: 46213670
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_22"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quincyqiang/test | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- acceptability-classification
- natural-language-inference
- semantic-similarity-scoring
- sentiment-classification
- text-scoring
paperswithcode_id: glue
pretty_name: GLUE (General Language Understanding Evaluation benchmark)
train-eval-index:
- config: cola
task: text-classification
task_id: binary_classification
splits:
train_split: train
eval_split: validation
col_mapping:
sentence: text
label: target
- config: sst2
task: text-classification
task_id: binary_classification
splits:
train_split: train
eval_split: validation
col_mapping:
sentence: text
label: target
- config: mrpc
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
- config: qqp
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
question1: text1
question2: text2
label: target
- config: stsb
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
- config: mnli
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation_matched
col_mapping:
premise: text1
hypothesis: text2
label: target
- config: mnli_mismatched
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
premise: text1
hypothesis: text2
label: target
- config: mnli_matched
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
premise: text1
hypothesis: text2
label: target
- config: qnli
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
question: text1
sentence: text2
label: target
- config: rte
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
- config: wnli
task: text-classification
task_id: natural_language_inference
splits:
train_split: train
eval_split: validation
col_mapping:
sentence1: text1
sentence2: text2
label: target
configs:
- ax
- cola
- mnli
- mnli_matched
- mnli_mismatched
- mrpc
- qnli
- qqp
- rte
- sst2
- stsb
- wnli
tags:
- qa-nli
- coreference-nli
- paraphrase-identification
---
# Dataset Card for GLUE
## Table of Contents
- [Dataset Card for GLUE](#dataset-card-for-glue)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [ax](#ax)
- [cola](#cola)
- [mnli](#mnli)
- [mnli_matched](#mnli_matched)
- [mnli_mismatched](#mnli_mismatched)
- [mrpc](#mrpc)
- [qnli](#qnli)
- [qqp](#qqp)
- [rte](#rte)
- [sst2](#sst2)
- [stsb](#stsb)
- [wnli](#wnli)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [ax](#ax-1)
- [cola](#cola-1)
- [mnli](#mnli-1)
- [mnli_matched](#mnli_matched-1)
- [mnli_mismatched](#mnli_mismatched-1)
- [mrpc](#mrpc-1)
- [qnli](#qnli-1)
- [qqp](#qqp-1)
- [rte](#rte-1)
- [sst2](#sst2-1)
- [stsb](#stsb-1)
- [wnli](#wnli-1)
- [Data Fields](#data-fields)
- [ax](#ax-2)
- [cola](#cola-2)
- [mnli](#mnli-2)
- [mnli_matched](#mnli_matched-2)
- [mnli_mismatched](#mnli_mismatched-2)
- [mrpc](#mrpc-2)
- [qnli](#qnli-2)
- [qqp](#qqp-2)
- [rte](#rte-2)
- [sst2](#sst2-2)
- [stsb](#stsb-2)
- [wnli](#wnli-2)
- [Data Splits](#data-splits)
- [ax](#ax-3)
- [cola](#cola-3)
- [mnli](#mnli-3)
- [mnli_matched](#mnli_matched-3)
- [mnli_mismatched](#mnli_mismatched-3)
- [mrpc](#mrpc-3)
- [qnli](#qnli-3)
- [qqp](#qqp-3)
- [rte](#rte-3)
- [sst2](#sst2-3)
- [stsb](#stsb-3)
- [wnli](#wnli-3)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://nyu-mll.github.io/CoLA/](https://nyu-mll.github.io/CoLA/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 955.33 MB
- **Size of the generated dataset:** 229.68 MB
- **Total amount of disk used:** 1185.01 MB
### Dataset Summary
GLUE, the General Language Understanding Evaluation benchmark (https://gluebenchmark.com/) is a collection of resources for training, evaluating, and analyzing natural language understanding systems.
### Supported Tasks and Leaderboards
The leaderboard for the GLUE benchmark can be found [at this address](https://gluebenchmark.com/). It comprises the following tasks:
#### ax
A manually-curated evaluation dataset for fine-grained analysis of system performance on a broad range of linguistic phenomena. This dataset evaluates sentence understanding through Natural Language Inference (NLI) problems. Use a model trained on MultiNLI to produce predictions for this dataset.
#### cola
The Corpus of Linguistic Acceptability consists of English acceptability judgments drawn from books and journal articles on linguistic theory. Each example is a sequence of words annotated with whether it is a grammatical English sentence.
#### mnli
The Multi-Genre Natural Language Inference Corpus is a crowdsourced collection of sentence pairs with textual entailment annotations. Given a premise sentence and a hypothesis sentence, the task is to predict whether the premise entails the hypothesis (entailment), contradicts the hypothesis (contradiction), or neither (neutral). The premise sentences are gathered from ten different sources, including transcribed speech, fiction, and government reports. The authors of the benchmark use the standard test set, for which they obtained private labels from the RTE authors, and evaluate on both the matched (in-domain) and mismatched (cross-domain) sections. They also use and recommend the SNLI corpus as 550k examples of auxiliary training data.
#### mnli_matched
The matched validation and test splits from MNLI. See the "mnli" BuilderConfig for additional information.
#### mnli_mismatched
The mismatched validation and test splits from MNLI. See the "mnli" BuilderConfig for additional information.
#### mrpc
The Microsoft Research Paraphrase Corpus (Dolan & Brockett, 2005) is a corpus of sentence pairs automatically extracted from online news sources, with human annotations for whether the sentences in the pair are semantically equivalent.
#### qnli
The Stanford Question Answering Dataset is a question-answering dataset consisting of question-paragraph pairs, where one of the sentences in the paragraph (drawn from Wikipedia) contains the answer to the corresponding question (written by an annotator). The authors of the benchmark convert the task into sentence pair classification by forming a pair between each question and each sentence in the corresponding context, and filtering out pairs with low lexical overlap between the question and the context sentence. The task is to determine whether the context sentence contains the answer to the question. This modified version of the original task removes the requirement that the model select the exact answer, but also removes the simplifying assumptions that the answer is always present in the input and that lexical overlap is a reliable cue.
#### qqp
The Quora Question Pairs2 dataset is a collection of question pairs from the community question-answering website Quora. The task is to determine whether a pair of questions are semantically equivalent.
#### rte
The Recognizing Textual Entailment (RTE) datasets come from a series of annual textual entailment challenges. The authors of the benchmark combined the data from RTE1 (Dagan et al., 2006), RTE2 (Bar Haim et al., 2006), RTE3 (Giampiccolo et al., 2007), and RTE5 (Bentivogli et al., 2009). Examples are constructed based on news and Wikipedia text. The authors of the benchmark convert all datasets to a two-class split, where for three-class datasets they collapse neutral and contradiction into not entailment, for consistency.
#### sst2
The Stanford Sentiment Treebank consists of sentences from movie reviews and human annotations of their sentiment. The task is to predict the sentiment of a given sentence. It uses the two-way (positive/negative) class split, with only sentence-level labels.
#### stsb
The Semantic Textual Similarity Benchmark (Cer et al., 2017) is a collection of sentence pairs drawn from news headlines, video and image captions, and natural language inference data. Each pair is human-annotated with a similarity score from 1 to 5.
#### wnli
The Winograd Schema Challenge (Levesque et al., 2011) is a reading comprehension task in which a system must read a sentence with a pronoun and select the referent of that pronoun from a list of choices. The examples are manually constructed to foil simple statistical methods: each one is contingent on contextual information provided by a single word or phrase in the sentence. To convert the problem into sentence pair classification, the authors of the benchmark construct sentence pairs by replacing the ambiguous pronoun with each possible referent. The task is to predict if the sentence with the pronoun substituted is entailed by the original sentence. They use a small evaluation set consisting of new examples derived from fiction books that was shared privately by the authors of the original corpus. While the included training set is balanced between two classes, the test set is imbalanced between them (65% not entailment). Also, due to a data quirk, the development set is adversarial: hypotheses are sometimes shared between training and development examples, so if a model memorizes the training examples, it will predict the wrong label on the corresponding development set example. As with QNLI, each example is evaluated separately, so there is not a systematic correspondence between a model's score on this task and its score on the unconverted original task. The authors of the benchmark call the converted dataset WNLI (Winograd NLI).
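The pronoun-substitution step described for WNLI can be sketched as follows. `make_wnli_pairs` is a hypothetical helper illustrating the conversion, not the benchmark's actual code:

```python
import re

def make_wnli_pairs(sentence, pronoun, referents):
    """Build (premise, hypothesis) pairs by substituting each candidate referent
    for the first whole-word occurrence of the pronoun."""
    pattern = re.compile(r"\b" + re.escape(pronoun) + r"\b")
    return [(sentence, pattern.sub(referent, sentence, count=1)) for referent in referents]
```

Each resulting pair is then labeled as entailed or not entailed depending on whether the substituted referent is the true antecedent of the pronoun.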
### Languages
The language data in GLUE is in English (BCP-47 `en`).
## Dataset Structure
### Data Instances
#### ax
- **Size of downloaded dataset files:** 0.21 MB
- **Size of the generated dataset:** 0.23 MB
- **Total amount of disk used:** 0.44 MB
An example of 'test' looks as follows.
```
{
"premise": "The cat sat on the mat.",
"hypothesis": "The cat did not sit on the mat.",
"label": -1,
  "idx": 0
}
```
#### cola
- **Size of downloaded dataset files:** 0.36 MB
- **Size of the generated dataset:** 0.58 MB
- **Total amount of disk used:** 0.94 MB
An example of 'train' looks as follows.
```
{
"sentence": "Our friends won't buy this analysis, let alone the next one we propose.",
"label": 1,
"idx": 0
}
```
#### mnli
- **Size of downloaded dataset files:** 298.29 MB
- **Size of the generated dataset:** 78.65 MB
- **Total amount of disk used:** 376.95 MB
An example of 'train' looks as follows.
```
{
"premise": "Conceptually cream skimming has two basic dimensions - product and geography.",
"hypothesis": "Product and geography are what make cream skimming work.",
"label": 1,
"idx": 0
}
```
#### mnli_matched
- **Size of downloaded dataset files:** 298.29 MB
- **Size of the generated dataset:** 3.52 MB
- **Total amount of disk used:** 301.82 MB
An example of 'test' looks as follows.
```
{
"premise": "Hierbas, ans seco, ans dulce, and frigola are just a few names worth keeping a look-out for.",
"hypothesis": "Hierbas is a name worth looking out for.",
"label": -1,
"idx": 0
}
```
#### mnli_mismatched
- **Size of downloaded dataset files:** 298.29 MB
- **Size of the generated dataset:** 3.73 MB
- **Total amount of disk used:** 302.02 MB
An example of 'test' looks as follows.
```
{
"premise": "What have you decided, what are you going to do?",
"hypothesis": "So what's your decision?",
"label": -1,
"idx": 0
}
```
#### mrpc
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qqp
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### rte
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### sst2
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### stsb
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### wnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Fields
The data fields are the same among all splits.
#### ax
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: a `int32` feature.
#### cola
- `sentence`: a `string` feature.
- `label`: a classification label, with possible values including `unacceptable` (0), `acceptable` (1).
- `idx`: a `int32` feature.
#### mnli
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: a `int32` feature.
#### mnli_matched
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: a `int32` feature.
#### mnli_mismatched
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
- `idx`: a `int32` feature.
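Since `label` is stored as an integer in all of the NLI-style configs above, a minimal decoding helper (plain Python; the `datasets` library exposes the same mapping via `ClassLabel.int2str`) might look like:

```python
# Label encoding used by ax, mnli, mnli_matched, and mnli_mismatched;
# -1 marks unlabeled (test-split) examples, as in the instances shown earlier.
NLI_LABELS = {0: "entailment", 1: "neutral", 2: "contradiction", -1: "unlabeled"}

def decode_label(example):
    """Map an example's integer label to its class name."""
    return NLI_LABELS[example["label"]]

example = {
    "premise": "Conceptually cream skimming has two basic dimensions - product and geography.",
    "hypothesis": "Product and geography are what make cream skimming work.",
    "label": 1,
    "idx": 0,
}
decode_label(example)  # "neutral"
```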
#### mrpc
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qqp
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### rte
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### sst2
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### stsb
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### wnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Splits
#### ax
| |test|
|---|---:|
|ax |1104|
#### cola
| |train|validation|test|
|----|----:|---------:|---:|
|cola| 8551| 1043|1063|
#### mnli
| |train |validation_matched|validation_mismatched|test_matched|test_mismatched|
|----|-----:|-----------------:|--------------------:|-----------:|--------------:|
|mnli|392702| 9815| 9832| 9796| 9847|
#### mnli_matched
| |validation|test|
|------------|---------:|---:|
|mnli_matched| 9815|9796|
#### mnli_mismatched
| |validation|test|
|---------------|---------:|---:|
|mnli_mismatched| 9832|9847|
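As the tables indicate, `mnli_matched` and `mnli_mismatched` simply re-expose the validation and test portions of the combined `mnli` config; the split sizes line up exactly:

```python
# Split sizes copied from the tables above.
mnli = {"train": 392702, "validation_matched": 9815, "validation_mismatched": 9832,
        "test_matched": 9796, "test_mismatched": 9847}
mnli_matched = {"validation": 9815, "test": 9796}
mnli_mismatched = {"validation": 9832, "test": 9847}

# Each matched/mismatched split is the corresponding slice of mnli.
assert mnli["validation_matched"] == mnli_matched["validation"]
assert mnli["test_matched"] == mnli_matched["test"]
assert mnli["validation_mismatched"] == mnli_mismatched["validation"]
assert mnli["test_mismatched"] == mnli_mismatched["test"]
```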
#### mrpc
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### qqp
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### rte
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### sst2
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### stsb
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### wnli
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{warstadt2018neural,
title={Neural Network Acceptability Judgments},
author={Warstadt, Alex and Singh, Amanpreet and Bowman, Samuel R},
journal={arXiv preprint arXiv:1805.12471},
year={2018}
}
@inproceedings{wang2019glue,
title={{GLUE}: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding},
author={Wang, Alex and Singh, Amanpreet and Michael, Julian and Hill, Felix and Levy, Omer and Bowman, Samuel R.},
note={In the Proceedings of ICLR.},
year={2019}
}
```

Note that each GLUE dataset has its own citation. Please see the source to see
the correct citation for each contained dataset.
### Contributions
Thanks to [@patpizio](https://github.com/patpizio), [@jeswan](https://github.com/jeswan), [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham) for adding this dataset. |
open-llm-leaderboard/details_Cartinoe5930__SOLAR-DUS-implement | ---
pretty_name: Evaluation run of Cartinoe5930/SOLAR-DUS-implement
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Cartinoe5930/SOLAR-DUS-implement](https://huggingface.co/Cartinoe5930/SOLAR-DUS-implement)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Cartinoe5930__SOLAR-DUS-implement\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T14:37:28.066845](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__SOLAR-DUS-implement/blob/main/results_2024-01-16T14-37-28.066845.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6312296500454484,\n\
\ \"acc_stderr\": 0.0323614114970197,\n \"acc_norm\": 0.6390797710653894,\n\
\ \"acc_norm_stderr\": 0.033030038319899674,\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.01522589934082683,\n \"mc2\": 0.4071642776487792,\n\
\ \"mc2_stderr\": 0.01422601728098354\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5597269624573379,\n \"acc_stderr\": 0.014506769524804241,\n\
\ \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.014342036483436177\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6122286397132045,\n\
\ \"acc_stderr\": 0.004862461799370392,\n \"acc_norm\": 0.811790479984067,\n\
\ \"acc_norm_stderr\": 0.003900805416736719\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368881,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368881\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.049020713000019756,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.049020713000019756\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097113,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097113\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739154,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739154\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.03338473403207401,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.03338473403207401\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001503,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001503\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.02500931379006971,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.02500931379006971\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31843575418994413,\n\
\ \"acc_stderr\": 0.015581008080360276,\n \"acc_norm\": 0.31843575418994413,\n\
\ \"acc_norm_stderr\": 0.015581008080360276\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868055,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868055\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.42242503259452413,\n \"acc_stderr\": 0.012615600475734921,\n\
\ \"acc_norm\": 0.42242503259452413,\n \"acc_norm_stderr\": 0.012615600475734921\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n \"\
acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.01522589934082683,\n \"mc2\": 0.4071642776487792,\n\
\ \"mc2_stderr\": 0.01422601728098354\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650877\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2699014404852161,\n \
\ \"acc_stderr\": 0.012227442856468897\n }\n}\n```"
repo_url: https://huggingface.co/Cartinoe5930/SOLAR-DUS-implement
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|arc:challenge|25_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|arc:challenge|25_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|gsm8k|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|gsm8k|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hellaswag|10_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hellaswag|10_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T14-31-52.747205.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T14-37-28.066845.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T14-37-28.066845.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- '**/details_harness|winogrande|5_2024-01-16T14-31-52.747205.parquet'
- split: 2024_01_16T14_37_28.066845
path:
- '**/details_harness|winogrande|5_2024-01-16T14-37-28.066845.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T14-37-28.066845.parquet'
- config_name: results
data_files:
- split: 2024_01_16T14_31_52.747205
path:
- results_2024-01-16T14-31-52.747205.parquet
- split: 2024_01_16T14_37_28.066845
path:
- results_2024-01-16T14-37-28.066845.parquet
- split: latest
path:
- results_2024-01-16T14-37-28.066845.parquet
---
# Dataset Card for Evaluation run of Cartinoe5930/SOLAR-DUS-implement
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Cartinoe5930/SOLAR-DUS-implement](https://huggingface.co/Cartinoe5930/SOLAR-DUS-implement) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Cartinoe5930__SOLAR-DUS-implement",
"harness_winogrande_5",
	split="latest")
```
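Each run is stored under a split named with its timestamp, and the `latest` split mirrors the newest one. As an offline sketch (using the split names from the config above), the fixed-width timestamp format sorts lexicographically in chronological order, so the newest run can be picked with a plain string comparison:

```python
# Timestamped split names from this dataset's config.
# The fixed-width "YYYY_MM_DDTHH_MM_SS.ffffff" format sorts
# lexicographically in chronological order, so max() picks
# the run that the "latest" split points to.
splits = ["2024_01_16T14_31_52.747205", "2024_01_16T14_37_28.066845"]
latest = max(splits)
print(latest)  # 2024_01_16T14_37_28.066845
```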
## Latest results
These are the [latest results from run 2024-01-16T14:37:28.066845](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__SOLAR-DUS-implement/blob/main/results_2024-01-16T14-37-28.066845.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6312296500454484,
"acc_stderr": 0.0323614114970197,
"acc_norm": 0.6390797710653894,
"acc_norm_stderr": 0.033030038319899674,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.01522589934082683,
"mc2": 0.4071642776487792,
"mc2_stderr": 0.01422601728098354
},
"harness|arc:challenge|25": {
"acc": 0.5597269624573379,
"acc_stderr": 0.014506769524804241,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.014342036483436177
},
"harness|hellaswag|10": {
"acc": 0.6122286397132045,
"acc_stderr": 0.004862461799370392,
"acc_norm": 0.811790479984067,
"acc_norm_stderr": 0.003900805416736719
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368881,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368881
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.39,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097113,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097113
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.01646534546739154,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.01646534546739154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.03338473403207401,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.03338473403207401
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001503,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001503
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.02500931379006971,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.02500931379006971
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31843575418994413,
"acc_stderr": 0.015581008080360276,
"acc_norm": 0.31843575418994413,
"acc_norm_stderr": 0.015581008080360276
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868055,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868055
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.012615600475734921,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.012615600475734921
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.01522589934082683,
"mc2": 0.4071642776487792,
"mc2_stderr": 0.01422601728098354
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650877
},
"harness|gsm8k|5": {
"acc": 0.2699014404852161,
"acc_stderr": 0.012227442856468897
}
}
```
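Because the results above are plain nested dictionaries once loaded, per-task metrics can be filtered directly. A small sketch, reproducing only a handful of the task entries above:

```python
# A subset of the per-task results shown above, copied verbatim.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.39},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.75},
    "harness|gsm8k|5": {"acc": 0.2699014404852161},
}

# Collect the tasks whose accuracy falls below a chosen cutoff.
weak = sorted(task for task, metrics in results.items() if metrics["acc"] < 0.4)
print(weak)
```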
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tmrmr/pessimistic_rlhf_jsai2024 | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
- name: log_prob
dtype: float64
- name: perplexity
dtype: float64
- name: num_tokens
dtype: int64
splits:
- name: train
num_bytes: 413246
num_examples: 5000
- name: valid
num_bytes: 41314
num_examples: 500
- name: test
num_bytes: 41391
num_examples: 500
- name: unlabeled
num_bytes: 831779
num_examples: 10000
download_size: 537280
dataset_size: 1327730
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
- split: unlabeled
path: data/unlabeled-*
---
|
NorGLM/NO-BoolQ | ---
license: cc-by-sa-3.0
language:
- 'no'
---
## Dataset Card for NO-BoolQ ##
NO-BoolQ is machine translated from the [Google BoolQ dataset](https://huggingface.co/datasets/google/boolq). It is a question answering dataset split into train, test and validation sets, matching its original dataset.
This dataset belongs to NLEBench, a set of Norwegian benchmarks for evaluating Norwegian Natural Language Understanding (NLU) tasks.
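Since BoolQ-style questions have boolean answers, a common preprocessing step is to map them to text labels before training a classifier. A minimal sketch; the Norwegian label strings below are illustrative choices, not fields of this dataset:

```python
# Map a BoolQ-style boolean answer to a Norwegian yes/no label.
# The "ja"/"nei" strings are illustrative, not part of the dataset.
def answer_to_label(answer: bool) -> str:
    return "ja" if answer else "nei"

print(answer_to_label(True), answer_to_label(False))  # ja nei
```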
## Licensing Information
This dataset is built upon an existing dataset. We therefore follow its original license information.
## Citation Information
The dataset is from the GLUE benchmark:
```
@inproceedings{wang2019glue,
title={{GLUE}: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding},
author={Wang, Alex and Singh, Amanpreet and Michael, Julian and Hill, Felix and Levy, Omer and Bowman, Samuel R.},
note={In the Proceedings of ICLR.},
year={2019}
}
``` |
DZN222/joaocaetano | ---
license: openrail
---
|
alexcom/analisis-sentimientos-textos-turisitcos-mx-tipoV2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 92924444
num_examples: 226531
- name: test
num_bytes: 10306957
num_examples: 25171
download_size: 63421013
dataset_size: 103231401
---
# Dataset Card for "analisis-sentimientos-textos-turisitcos-mx-tipoV2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Marxulia/asl_sign_languages_alphabets_v02 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
'4': E
'5': F
'6': G
'7': H
'8': I
'9': J
'10': K
'11': L
'12': M
'13': 'N'
'14': O
'15': P
'16': Q
'17': R
'18': S
'19': T
'20': U
'21': V
'22': W
'23': X
'24': 'Y'
'25': Z
splits:
- name: train
num_bytes: 5559518
num_examples: 520
download_size: 5494142
dataset_size: 5559518
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- image-classification
language:
- en
tags:
- code
size_categories:
- n<1K
--- |
joey234/mmlu-moral_disputes-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 105930
num_examples: 346
download_size: 60234
dataset_size: 105930
---
# Dataset Card for "mmlu-moral_disputes-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Canadian_Speaking_English_Speech_Data_by_Mobile_Phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Canadian_Speaking_English_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1047?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
466 native Canadian speakers were involved, balanced for gender. The recording corpus is rich in content, covering a wide range of domains such as generic command and control, human-machine interaction, smart home, and in-car use. The transcription corpus has been manually proofread to ensure high accuracy.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1047?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Canadian English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commerical License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |
iarata/PHCR-DB25 | ---
language:
- fa
pretty_name: Persian Historical Documents Handwritten Characters
size_categories:
- 1K<n<10K
tags:
- ocr
- character-recognition
- persian
- historical
- handwritten
- nastaliq
- character
---
# Persian Historical Documents Handwritten Characters
## Dataset Description
- **Model**: https://huggingface.co/iarata/Few-Shot-PHCR
- **Repository:** https://github.com/iarata/persian-docs-ocr
- **Paper:** https://doi.org/10.1007/978-3-031-53969-5_20
- **Point of Contact:** hajebrahimi.research [at] gmail [dot] com
### Summary
This dataset contains pre-processed images of the contextual forms of Persian characters (except the letter گ) from 5 handwritten Persian historical books written in Nastaliq script. It contains 2,775 images across 111 classes. The images are black-and-white TIFF files at a resolution of 72 dpi and a size of 395 × 395 pixels.
### Languages
Persian

## Dataset Structure
The dataset is structured as follows:
```
├── data
│ ├── 06a9_01.tif
│ ├── 06a9_02.tif
│ ├── 06a9_03.tif
│ ├── 06a9_04.tif
│ ├── 06a9_05.tif
│ ├── ...
│ ├── 06a9_25.tif
│ │
│ ├── 06cc_01.tif
│ ├── 06cc_02.tif
│ ├── 06cc_03.tif
│ ├── 06cc_04.tif
│ ├── 06cc_05.tif
│ ├── ...
│ ├── 06cc_25.tif
│ ├── ...
```
The naming of each image indicates the UTF-16 hexadecimal code ([Hex to String Decoder](https://dencode.com/en/string/hex)) of a character's contextual form, followed by the number of the image. In the numbering, every 5 images come from a new book. The contextual form of every character is treated as a separate class, resulting in 111 classes.
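The naming scheme can be decoded with the standard library alone. A small sketch; the book-index arithmetic assumes the "every 5 images are from a new book" convention described above:

```python
def decode_filename(name: str):
    """Decode a filename like '06a9_01.tif' into (character, image number,
    book index), using the UTF-16 hex code before the underscore and the
    every-5-images-per-book numbering convention."""
    stem = name.rsplit(".", 1)[0]          # drop the .tif extension
    hex_code, number = stem.split("_")
    char = chr(int(hex_code, 16))          # e.g. "06a9" -> 'ک'
    image_no = int(number)
    book = (image_no - 1) // 5 + 1         # images 1-5 -> book 1, 6-10 -> book 2, ...
    return char, image_no, book

print(decode_filename("06a9_07.tif"))  # ('ک', 7, 2)
```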
## Dataset Creation
For building this dataset, 5 historical Persian books from the [Library of Congress](https://loc.gov) were used.
### Source Data
The data was collected from 5 historical Persian books from the [Library of Congress](https://loc.gov). The books are as follows:
- [Shah-nameh by Firdausi](https://www.loc.gov/item/2012498868/)
- [Dīvān](https://www.loc.gov/item/2015481730/)
- [Kitāb-i Rūmī al-Mawlawī](https://www.loc.gov/item/2016397707)
- [Gulistān](https://www.loc.gov/item/2017406684/)
- [Qajar-era poetry](https://www.loc.gov/item/2017498320/)
The images were pre-processed using the following steps:
Images were first normalized to reduce noise from the background of the characters. The normalized image was then converted to a single-channel grayscale image. Following that, image thresholding was applied to the grayscale image to remove the characters' background. The thresholded image was binarized so that pixel values greater than 0 become 255 (white), while pixels with a value of 0 (black) remain unchanged. Finally, the binarized image was inverted.
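The steps above can be sketched in plain Python on a nested list of grayscale pixel values. This is a simplified illustration only: the threshold value is an assumption, and the original work's normalization step is omitted:

```python
def preprocess(gray_pixels, threshold=128):
    """Simplified sketch of the card's pipeline: threshold away the
    background, binarize non-zero pixels to 255 (white), then invert.
    The threshold value 128 is an illustrative assumption."""
    out = []
    for row in gray_pixels:
        new_row = []
        for value in row:
            value = value if value >= threshold else 0  # thresholding
            value = 255 if value > 0 else 0             # binarization
            new_row.append(255 - value)                 # inversion
        out.append(new_row)
    return out

print(preprocess([[200, 50], [0, 255]]))  # [[0, 255], [255, 0]]
```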
### Annotations
Before pre-processing the images, the characters were cropped from the books and saved with their UTF-16 hexadecimal code plus the number of the image (e.g. `06a9_01.tif`).
#### Annotators:
- [Hajebrahimi Alireza](https://www.linkedin.com/in/alireza-hajebrahimi/)
- [Hajebrahimi Reyhaneh](https://www.linkedin.com/in/reyhaneh-hajebrahimi-2565451a0/)
### Citation Information
Hajebrahimi, A., Santoso, M.E., Kovacs, M., Kryssanov, V.V. (2024). Few-Shot Learning for Character Recognition in Persian Historical Documents. In: Nicosia, G., Ojha, V., La Malfa, E., La Malfa, G., Pardalos, P.M., Umeton, R. (eds) Machine Learning, Optimization, and Data Science. LOD 2023. Lecture Notes in Computer Science, vol 14505. Springer, Cham. https://doi.org/10.1007/978-3-031-53969-5_20
**BibTeX:**
```bibtex
@InProceedings{10.1007/978-3-031-53969-5_20,
author="Hajebrahimi, Alireza
and Santoso, Michael Evan
and Kovacs, Mate
and Kryssanov, Victor V.",
editor="Nicosia, Giuseppe
and Ojha, Varun
and La Malfa, Emanuele
and La Malfa, Gabriele
and Pardalos, Panos M.
and Umeton, Renato",
title="Few-Shot Learning for Character Recognition in Persian Historical Documents",
booktitle="Machine Learning, Optimization, and Data Science",
year="2024",
publisher="Springer Nature Switzerland",
address="Cham",
pages="259--273",
abstract="Digitizing historical documents is crucial for the preservation of cultural heritage. The digitization of documents written in Perso-Arabic scripts, however, presents multiple challenges. The Nastaliq calligraphy can be difficult to read even for a native speaker, and the four contextual forms of alphabet letters pose a complex task to current optical character recognition systems. To address these challenges, the presented study develops an approach for character recognition in Persian historical documents using few-shot learning with Siamese Neural Networks. A small, novel dataset is created from Persian historical documents for training and testing purposes. Experiments on the dataset resulted in a 94.75{\%} testing accuracy for the few-shot learning task, and a 67{\%} character recognition accuracy was observed on unseen documents for 111 distinct character classes.",
isbn="978-3-031-53969-5"
}
```
|
RGBD-SOD/COME15K | ---
dataset_info:
features:
- name: name
dtype: string
- name: rgb
dtype: image
- name: depth
dtype: image
- name: gt
dtype: image
splits:
- name: train
num_bytes: 2280732875.25
num_examples: 8025
- name: validation
num_bytes: 1256773656.2
num_examples: 4600
- name: test
num_bytes: 788633364.0
num_examples: 3000
download_size: 4343671184
dataset_size: 4326139895.45
---
# Dataset Card for "COME15K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
seyoungsong/BBQ | ---
license: cc-by-4.0
---
# BBQ
Repository for the Bias Benchmark for QA dataset.
https://github.com/nyu-mll/BBQ
|
Zangs3011/no_robots_gpt2ChatFormated | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 29092450
num_examples: 9500
- name: test
num_bytes: 1560738
num_examples: 500
download_size: 18917122
dataset_size: 30653188
---
# Dataset Card for "no_robots_gpt2ChatFormated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alexator26/857_stickers_with_messy_bg | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 545505817.0
num_examples: 857
download_size: 545517656
dataset_size: 545505817.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|